Question 1 (Show main steps of your work to get full points)

The \((X'X)^{-1}\) matrix for the model \(y=\beta_0+\beta_1 x_1+\beta_2 x_2+\beta_3 x_3+\beta_4 x_4+\beta_5 x_5+\beta_6 x_6+\varepsilon\) is given below.

  1. If MSE = 1.395 and n = 38, compute the following (keep 4 or more decimal places; do NOT round in the intermediate steps):

  2. \(se(\hat\beta_4)\)

\[se(\mathbf{\hat\beta_4})=\sqrt{MSE\times C_{55}}=\sqrt{1.395\times0.069}=0.3102499\]

  1. \(Cov(\hat\beta_2,\hat\beta_4)\)

\[Cov(\mathbf{\hat\beta_2,\hat\beta_4})=MSE\times C_{35}=1.395\times(-0.035)=-0.048825\]

  1. \(Cor(\hat\beta_2,\hat\beta_4)\)

\[se(\mathbf{\hat\beta_2})=\sqrt{MSE\times C_{33}}=\sqrt{1.395\times0.067}=0.3057205\]

\[Cor(\mathbf{\hat\beta_2,\hat\beta_4})=\frac{Cov(\mathbf{\hat\beta_2,\hat\beta_4})}{se(\mathbf{\hat\beta_2})se(\mathbf{\hat\beta_4})}=\frac{-0.048825}{0.3057205\times0.3102499}=-0.5147615\]
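These three computations can be sanity-checked numerically; a minimal sketch in Python (the \(C_{ij}\) values are read off the given \((X'X)^{-1}\)):

```python
import math

MSE = 1.395                      # mean squared error of the fit
C33, C55 = 0.067, 0.069          # diagonal entries of (X'X)^{-1} for beta2-hat and beta4-hat
C35 = -0.035                     # off-diagonal entry for the pair

se_b4 = math.sqrt(MSE * C55)             # se(beta4-hat)
se_b2 = math.sqrt(MSE * C33)             # se(beta2-hat)
cov_b2_b4 = MSE * C35                    # Cov(beta2-hat, beta4-hat)
cor_b2_b4 = cov_b2_b4 / (se_b2 * se_b4)  # Cor(beta2-hat, beta4-hat)

print(se_b4, cov_b2_b4, cor_b2_b4)       # ~0.3102499, ~-0.048825, ~-0.5147615
```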

  1. Without computing anything, explain which estimator is the most consistent.

\(C_{66}=0.058\) is the smallest diagonal entry, so \(\hat\beta_5\) has the least variance and is the most consistent among the estimators.

  1. Without computing anything, list the pair(s) of estimators that are positively correlated. Provide a reason.

According to the \((X'X)^{-1}\),

\(C_{13},\ C_{17},\ C_{24},\ C_{25},\ C_{67}\) are positive.

The positively correlated pairs of parameters are

\(\hat\beta_0\) and \(\hat\beta_2\), \(\hat\beta_0\) and \(\hat\beta_6\), \(\hat\beta_1\) and \(\hat\beta_3\), \(\hat\beta_1\) and \(\hat\beta_4\), \(\hat\beta_5\) and \(\hat\beta_6\).

  1. Consider the following hypothesis: \(H_0: β_1=2β_3,β_2=β_3,β_5=0\)

  2. Report the T matrix, β vector and c vector along with their dimensions, and the rank of T matrix for testing the above hypothesis.

\[ \mathbf{T}=\begin{bmatrix} 0 & 1 & 0 & -2 & 0 & 0 & 0 \\ 0 & 0 & 1 & -1 & 0 & 0 & 0\\ 0 & 0 & 0 & 0 & 0 & 1 & 0 \end{bmatrix}_{3\times7} \quad \boldsymbol{\beta}=\begin{bmatrix} \beta_0 \\ \beta_1 \\ \beta_2 \\ \beta_3 \\ \beta_4 \\ \beta_5 \\ \beta_6 \end{bmatrix}_{7\times1} \quad \mathbf{c}=\begin{bmatrix} 0 \\ 0 \\ 0\end{bmatrix}_{3\times1} \quad \operatorname{rank}(\mathbf{T})=3 \]
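As a quick check that the three restriction rows are linearly independent (a sketch in Python, not part of the hand computation):

```python
import numpy as np

# Rows encode H0: beta1 - 2*beta3 = 0, beta2 - beta3 = 0, beta5 = 0
T = np.array([[0, 1, 0, -2, 0, 0, 0],
              [0, 0, 1, -1, 0, 0, 0],
              [0, 0, 0,  0, 0, 1, 0]])
c = np.zeros((3, 1))                       # right-hand side of T @ beta = c

print(T.shape, np.linalg.matrix_rank(T))   # (3, 7) and rank 3
```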

  1. Report the values of the numerator and denominator degrees of freedom for the corresponding F test. The numerator of the F statistic is the mean square for the hypothesis restrictions and the denominator is the MSE of the full model, thus

Under this hypothesis, \(y=\beta_0+2\beta_3x_1+\beta_3x_2+\beta_3x_3+\beta_4x_4+0\cdot x_5+\beta_6x_6+\varepsilon=\beta_0+\beta_3(2x_1+x_2+x_3)+\beta_4x_4+\beta_6x_6+\varepsilon\)

The numerator degrees of freedom is \(r=df_{Reduced}-df_{Full}=[n-(3+1)]-[n-(6+1)]=3\)

The denominator degrees of freedom is \(df_{Full}=n-(k+1)=38-(6+1)=31\)

  1. Show that the following equation is an alternative form of the sum of squares of regression or model (SSR).

\[SSR=\sum_{i=1}^n(\hat y_i-\bar y)^2=\sum_{i=1}^n(\hat y_i^2-2\hat y_i\bar y+\bar y^2)=\sum_{i=1}^n\hat y_i^2-2\bar y\sum_{i=1}^n\hat y_i+\sum_{i=1}^n\bar y^2\]

\[=\sum_{i=1}^n\hat y_i^2-2\bar yn\frac{\sum_{i=1}^n\hat y_i}n+n\bar y^2=\sum_{i=1}^n\hat y_i^2-2\bar yn\bar y+n\bar y^2=\sum_{i=1}^n\hat y_i^2-n\bar y^2\]

(The middle step uses \(\sum_{i=1}^n\hat y_i=\sum_{i=1}^n y_i=n\bar y\), which holds because the residuals sum to zero in a model with an intercept.)
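The identity can also be verified numerically; a quick Python sketch on simulated data (illustrative only, not the assignment's data):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 20
x = rng.normal(size=(n, 2))
X = np.column_stack([np.ones(n), x])           # design matrix with intercept
y = 1.0 + x @ np.array([2.0, -1.0]) + rng.normal(size=n)

beta_hat, *_ = np.linalg.lstsq(X, y, rcond=None)
yhat = X @ beta_hat
ybar = y.mean()

ssr_def = np.sum((yhat - ybar) ** 2)           # SSR by definition
ssr_alt = np.sum(yhat ** 2) - n * ybar ** 2    # the alternative form
print(np.isclose(ssr_def, ssr_alt))            # True
```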

Question 2 (Use software to analyze the given data)

The data in the WaterFlow file are simulated data on peak rate of flow (in cfs) of water from six watersheds following storm episodes. The predictors are:

x1 : Area of watershed (mi²)
x2 : Area impervious to water (mi²)
x3 : Average slope of watershed (percent)
x4 : Longest stream flow in watershed (1000s of feet)
x5 : Surface absorbency index (0 = complete absorbency, 100 = no absorbency)
x6 : Estimated soil storage capacity (inches of water)
x7 : Infiltration rate of water into soil (inches/hour)
x8 : Rainfall (inches)
x9 : Time period during which rainfall exceeded ¼ inch/hour

  1. Create the matrix of scatterplots and compute the correlation matrix for all the variables. Copy and paste them here.


  1. Based on scatterplots and correlation, explain which predictors are significantly related to (most likely to contribute to the variation in) the response variable.

Based on the scatterplots and correlations, X4 (0.866), X1 (0.781), X7 (0.668), and X2 (0.666) have medium to strong positive linear relationships with the response variable (correlation coefficient above 0.6), while X5 (−0.62) has a medium negative linear relationship with the response.


  1. Fit the full model.

\[\hat y=292.561-203.144X_1+ 1055.782X_2-49.24X_3+209.762X_4-10.197X_5-24.558X_6+142.778X_7+511.713X_8-301.872X_9\]


  1. Explain whether the overall model is significant at 5% significance level.

The overall model is statistically significant at the 5% significance level (p-value < 0.0001). However, most of the individual coefficients are not significant, so this full model is unlikely to be the best-fitting model.


  1. Explain whether assumptions of random errors and model are satisfied. If there is a violation of those, then suggest reasonable methods to correct them.
  • Residual Diagnostics: Use plots to examine residuals to validate OLS assumptions

There are some violations of the assumptions about the errors:

On the residual plot, there is a funnel pattern (suggesting non-constant error variance).

On the outlier and leverage plot, there are two outliers.

On the QQ plot, most points follow an approximately straight line, but there is some positive skew.

  • Suggestion: Transform and other diagnostics.

I suggest using the natural log of the response as a variance-stabilizing transformation.

Other diagnostics, such as heteroskedasticity tests, variable selection, and measures of influence, should also be considered.


  1. How much of the sum of squares is explained by rainfall, given that all the other regression coefficients are in the model?

According to the partial F test, the partial sum of squares explained by rainfall, given that all the other regression coefficients are in the model, is 2209825.
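Since rainfall contributes one degree of freedom, its partial sum of squares equals its partial F statistic times the MSE; a quick Python check against the ANOVA output reported in the code section (small differences are rounding):

```python
F_x8 = 5.9523        # partial F for X8 from the Type II ANOVA table
MSE = 371256.369     # residual mean square of the full model

ss_x8 = F_x8 * MSE   # partial SS for a 1-df term: SS = F * MSE
print(ss_x8)         # ~2209825, up to rounding of F and MSE
```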


  1. Explain whether there is a problem of multicollinearity.
  • Collinearity diagnostics:

The model does have serious problems of multicollinearity: the VIFs of X4 (105.754708), X1 (101.859709), X3 (31.446394), and X7 (20.53505) are larger than 10.

It will be important to address the multicollinearity. However, X7, X1, and X4 have medium to strong positive linear relationships with the response variable, so removing them is also risky. Further diagnostics and comparisons are needed.
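The VIF of predictor j is \(1/(1-R_j^2)\), where \(R_j^2\) comes from regressing predictor j on all the other predictors; a minimal Python sketch of that definition (not the olsrr implementation):

```python
import numpy as np

def vif(X, j):
    """VIF of column j of the predictor matrix X (no intercept column):
    regress X[:, j] on the remaining predictors plus an intercept and
    return 1 / (1 - R^2) of that auxiliary regression."""
    n = X.shape[0]
    others = np.column_stack([np.ones(n), np.delete(X, j, axis=1)])
    target = X[:, j]
    beta, *_ = np.linalg.lstsq(others, target, rcond=None)
    resid = target - others @ beta
    r2 = 1.0 - resid @ resid / np.sum((target - target.mean()) ** 2)
    return 1.0 / (1.0 - r2)
```

A VIF above 10 means the corresponding predictor is largely explained by the others, which is the cutoff used above.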


  1. Interpret the estimated coefficient of rainfall predictor of the full model using question context.

The coefficient of 511.713 suggests that the peak rate of flow increases by 511.713 cubic feet per second when rainfall increases by 1 inch, holding the other variables constant.
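The ceteris-paribus reading can be illustrated by evaluating the fitted equation at two predictor settings that differ only in rainfall; a Python sketch (the x values below are made up for illustration, only the coefficients come from the fitted model):

```python
import numpy as np

# Coefficients of the fitted full model, intercept first
b = np.array([292.561, -203.144, 1055.782, -49.240, 209.762,
              -10.197, -24.558, 142.778, 511.713, -301.872])

# Hypothetical observation: leading 1 for the intercept, then x1..x9
x = np.array([1.0, 3.0, 0.5, 40.0, 10.0, 60.0, 2.0, 0.8, 2.0, 1.5])
x_more_rain = x.copy()
x_more_rain[8] += 1.0                 # one additional inch of rainfall (x8)

delta = b @ x_more_rain - b @ x       # change in predicted peak flow (cfs)
print(delta)                          # ~511.713, the x8 coefficient
```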


  1. Create a new variable using natural log of response. Then fit the full model using this new variable as response.

\[\widehat{\ln y}=3.402256-0.013532X_1-1.023664X_2+0.177966X_3+0.108788X_4-0.009622X_5-0.389474X_6+4.233475X_7+0.63007X_8-0.462276X_9\]

  1. Explain whether the overall model is significant at 5% significance level.

The overall model is statistically significant at the 5% significance level (p-value < 0.0001). However, most of the individual coefficients are not significant, so this full model is unlikely to be the best-fitting model.


  1. Explain whether there is a problem of multicollinearity.

The variance-stabilizing transformation does not change the multicollinearity problem, since the VIFs depend only on the predictors.

The model still has serious problems of multicollinearity: the VIFs of X4 (105.754708), X1 (101.859709), X3 (31.446394), and X7 (20.53505) are larger than 10.

It will be important to address the multicollinearity. However, X7, X1, and X4 have medium to strong positive linear relationships with the response variable, so removing them is also risky. Further diagnostics and comparisons are needed.


  1. If you wanted to simplify this full model, explain which predictor you would eliminate first.


  • Residual Diagnostics:

Includes plots to examine residuals to validate OLS assumptions

There is no violation of assumptions about the errors (no pattern on residual plots and points follow approximately straight line on the qq plot).

  • Variable selection:

Different variable selection procedures such as all-possible-regressions, best subset regression, stepwise regression, stepwise forward regression, and stepwise backward regression

  • Heteroskedasticity:

Tests for heteroskedasticity include the Bartlett test, Breusch-Pagan test, score test, and F test

  • Measures of influence:

Use different plots to detect and identify influential observations

  • Collinearity diagnostics:

VIF, tolerance, and condition indices to detect collinearity, and plots for assessing model fit and the contributions of variables

The general approaches for dealing with multicollinearity include collecting additional data, model respecification (redefining the regressors, variable elimination), and alternative estimation methods (ridge regression, principal-components regression).

“Variable elimination is often a highly effective technique. However, it may not provide a satisfactory solution if the regressors dropped from the model have significant explanatory power relative to the response y. That is, eliminating regressors to reduce multicollinearity may damage the predictive power of the model.” (p.304)
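As an illustration of the estimation-method remedies mentioned above, ridge regression has a simple closed form; a Python sketch (for illustration only; here the penalty is applied to every coefficient, and no intercept column is assumed):

```python
import numpy as np

def ridge(X, y, lam):
    """Closed-form ridge estimate (X'X + lam*I)^{-1} X'y.
    lam = 0 reproduces ordinary least squares; larger lam shrinks
    the coefficients toward zero, trading bias for lower variance."""
    p = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(p), X.T @ y)
```

Shrinkage stabilizes the estimates when \(X'X\) is near-singular, which is exactly the situation the large condition indices in the output indicate.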

Predictors X1 (area of watershed) and X4 (longest stream flow) have by far the largest VIFs (about 102 and 106), and a high correlation between them is expected, since a larger watershed will generally contain a longer stream. Hence one of X1 and X4 is the natural candidate to eliminate first.

X4 has the highest VIF, but it is also more strongly correlated with y (0.866) than X1 is (0.781), so removing it may cost explanatory power.

It is better to remove one of X1 and X4 first and then check whether the multicollinearity is resolved; the collinearity diagnostics in the output below show that dropping either X4 or X1 brings all remaining VIFs below 10.



  1. Use the forward selection method to find the best model (use α=0.15) and report the final fitted model with estimated coefficients here.

Stepwise Forward Regression based on p values (use α=0.15)

Stepwise AIC Forward Regression

Full model

eliminated model


  1. Use the backward elimination method to find the best model (use α=0.05) and report the final fitted model with estimated coefficients here.

Stepwise Backward Regression based on p values (use α=0.05)

Stepwise AIC Backward Regression

Full model

eliminated model


  1. Use best subsets method (6 models from each size) to find the best model for these data and report the final fitted model with estimated coefficients here.

Full model

eliminated model


  1. If the final models in the previous 3 methods are different, compare their model adequacy and suggest one best model.

Neither model has a multicollinearity problem (all VIFs < 10) or a violation of the assumptions about the errors (no pattern on the residual plots, and the points follow an approximately straight line on the QQ plot).

The model with 4 predictors has a slightly higher (by about 2%) adjusted R-squared than the model with only x1 and x2. Further, the x5 and x7 predictors are not statistically significant at the 10% significance level (p-values 0.11479 and 0.10356, respectively). There is no significant pattern on the plot of studentized residuals versus predicted values from the model with only x1 and x2, and the partial regression plots do not show nonlinear patterns, so first-order terms are adequate.

Finally, the model with 2 predictors is simpler than the model with 4 predictors. Therefore, the best model will be

  • Residual Diagnostics:

Includes plots to examine residuals to validate OLS assumptions

There is no violation of assumptions about the errors (no pattern on the residual plots, and the points follow an approximately straight line on the QQ plot).

(Plots shown: residual QQ plot, residual normality test, residual vs fitted values plot, residual histogram)

  • Variable selection:

Different variable selection procedures such as all-possible-regressions, best subset regression, stepwise regression, stepwise forward regression, and stepwise backward regression

  • Heteroskedasticity:

Tests for heteroskedasticity include the Bartlett test, Breusch-Pagan test, score test, and F test

(Output shown: Bartlett test, Breusch-Pagan test, score test, F test)

  • Measures of influence:

Use different plots to detect and identify influential observations

(Plots shown: Cook's D bar plot, Cook's D chart, DFBETAs panel, DFFITS plot, studentized residual plot, standardized residual chart, studentized residuals vs leverage plot, deleted studentized residual vs fitted values plot, Hadi plot, potential-residual plot)


  1. Provide complete ANOVA table for the best model. Provide partial sum of squares, estimated coefficients, standard errors, p-values, 95% Bonferroni joint confidence intervals for the coefficients of the best model. Provide in a tabular form clearly.

  1. How much variation in the response is explained by the best model after taking the number of data points and regression coefficients into account?

  1. Report the PRESS statistic of the best model.

About 72.93% of the variation in the peak rate of flow is explained by the best model after taking the number of observations and regression coefficients into account (adjusted R-squared).
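The PRESS statistic can be obtained from a single fit using the hat-matrix diagonals, rather than refitting the model n times; a Python sketch of the computation (not the olsrr call):

```python
import numpy as np

def press(X, y):
    """PRESS = sum over i of the squared leave-one-out prediction error,
    computed as (e_i / (1 - h_ii))^2 from one fit, where e_i are the
    ordinary residuals and h_ii the diagonals of the hat matrix."""
    H = X @ np.linalg.solve(X.T @ X, X.T)    # hat matrix X (X'X)^{-1} X'
    e = y - H @ y                            # ordinary residuals
    h = np.diag(H)
    return float(np.sum((e / (1.0 - h)) ** 2))
```

A smaller PRESS (equivalently, a higher predicted R-squared, which ols_regress reports) indicates better out-of-sample prediction.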


  1. Report the complete code along with output here.

(a) The matrix of scatterplots and the correlation matrix

(c) The fitted full model

# build the model
model_wf_full <- lm(y ~ X1 + X2 + X3 + X4 + X5 + X6 + X7 + X8 + X9, data=table_wf)
ols_regress(model_wf_full)
##                           Model Summary                            
## ------------------------------------------------------------------
## R                       0.906       RMSE                  609.308 
## R-Squared               0.821       Coef. Var              47.188 
## Adj. R-Squared          0.741       MSE                371256.369 
## Pred R-Squared          0.618       MAE                   366.548 
## ------------------------------------------------------------------
##  RMSE: Root Mean Square Error 
##  MSE: Mean Square Error 
##  MAE: Mean Absolute Error 
## 
##                                  ANOVA                                   
## ------------------------------------------------------------------------
##                     Sum of                                              
##                    Squares        DF    Mean Square      F         Sig. 
## ------------------------------------------------------------------------
## Regression    34143007.990         9    3793667.554    10.218    0.0000 
## Residual       7425127.376        20     371256.369                     
## Total         41568135.367        29                                    
## ------------------------------------------------------------------------
## 
##                                        Parameter Estimates                                        
## -------------------------------------------------------------------------------------------------
##       model        Beta    Std. Error    Std. Beta      t        Sig          lower        upper 
## -------------------------------------------------------------------------------------------------
## (Intercept)     292.561      4428.618                  0.066    0.948     -8945.373     9530.495 
##          X1    -203.144       410.268       -0.472    -0.495    0.626     -1058.947      652.660 
##          X2    1055.782      9833.700        0.028     0.107    0.916    -19456.957    21568.521 
##          X3     -49.240       156.200       -0.167    -0.315    0.756      -375.067      276.588 
##          X4     209.762       162.046        1.258     1.294    0.210      -128.259      547.783 
##          X5     -10.197        51.088       -0.059    -0.200    0.844      -116.764       96.370 
##          X6     -24.558       303.529       -0.012    -0.081    0.936      -657.709      608.592 
##          X7     142.778      3288.443        0.019     0.043    0.966     -6716.793     7002.349 
##          X8     511.713       209.741        0.541     2.440    0.024        74.200      949.226 
##          X9    -301.872       171.996       -0.398    -1.755    0.095      -660.649       56.905 
## -------------------------------------------------------------------------------------------------
model_wf_full%>% summary()
## 
## Call:
## lm(formula = y ~ X1 + X2 + X3 + X4 + X5 + X6 + X7 + X8 + X9, 
##     data = table_wf)
## 
## Residuals:
##      Min       1Q   Median       3Q      Max 
## -1404.21  -318.77    74.73   266.66  1274.30 
## 
## Coefficients:
##             Estimate Std. Error t value Pr(>|t|)  
## (Intercept)   292.56    4428.62   0.066   0.9480  
## X1           -203.14     410.27  -0.495   0.6259  
## X2           1055.78    9833.70   0.107   0.9156  
## X3            -49.24     156.20  -0.315   0.7558  
## X4            209.76     162.05   1.294   0.2103  
## X5            -10.20      51.09  -0.200   0.8438  
## X6            -24.56     303.53  -0.081   0.9363  
## X7            142.78    3288.44   0.043   0.9658  
## X8            511.71     209.74   2.440   0.0241 *
## X9           -301.87     172.00  -1.755   0.0945 .
## ---
## Signif. codes:  0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1
## 
## Residual standard error: 609.3 on 20 degrees of freedom
## Multiple R-squared:  0.8214, Adjusted R-squared:  0.741 
## F-statistic: 10.22 on 9 and 20 DF,  p-value: 9.744e-06
Anova(model_wf_full)
## Anova Table (Type II tests)
## 
## Response: y
##            Sum Sq Df F value  Pr(>F)  
## X1          91022  1  0.2452 0.62589  
## X2           4279  1  0.0115 0.91557  
## X3          36893  1  0.0994 0.75585  
## X4         622091  1  1.6756 0.21025  
## X5          14790  1  0.0398 0.84381  
## X6           2430  1  0.0065 0.93632  
## X7            700  1  0.0019 0.96580  
## X8        2209825  1  5.9523 0.02414 *
## X9        1143622  1  3.0804 0.09455 .
## Residuals 7425127 20                  
## ---
## Signif. codes:  0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1

(c) ii Residual diagnostics

#Model Fit Assessment
ols_plot_diagnostics(model_wf_full)

# Part & Partial Correlations
ols_test_correlation(model_wf_full) # Correlation between observed residuals and expected residuals under normality.
## [1] 0.9710713
# Residual Normality Test
ols_test_normality(model_wf_full) # Test for detecting violation of normality assumption. #If p-value is bigger, then no problem of non-normality #
## -----------------------------------------------
##        Test             Statistic       pvalue  
## -----------------------------------------------
## Shapiro-Wilk              0.9589         0.2898 
## Kolmogorov-Smirnov        0.1423         0.5314 
## Cramer-von Mises          2.5333         0.0000 
## Anderson-Darling          0.5169         0.1748 
## -----------------------------------------------

(c) iii The partial regression and nonlinear diagnostics

#Lack of Fit F Test
ols_pure_error_anova(lm(y~X8, data = table_wf))
## Lack of Fit F Test 
## ---------------
## Response :   y 
## Predictor:   X8 
## 
##                        Analysis of Variance Table                         
## -------------------------------------------------------------------------
##                 DF      Sum Sq        Mean Sq      F Value       Pr(>F)   
## -------------------------------------------------------------------------
## X8               1     4616882.92    4616882.92    5.795558    0.02290414 
## Residual        28    36951252.44    1319687.59                           
##  Lack of fit    21    31374881.28    1494041.97    1.875466     0.2003839 
##  Pure Error      7     5576371.17     796624.45                           
## -------------------------------------------------------------------------
# Variable Contributions
ols_plot_added_variable(model_wf_full)

# Residual Plus Component Plot
ols_plot_comp_plus_resid(model_wf_full)

(c) iv Collinearity diagnostics

# for full model
ols_coll_diag(model_wf_full)
## Tolerance and Variance Inflation Factor
## ---------------------------------------
## # A tibble: 9 x 3
##   Variables Tolerance    VIF
##   <chr>         <dbl>  <dbl>
## 1 X1          0.00982 102.  
## 2 X2          0.133     7.52
## 3 X3          0.0318   31.4 
## 4 X4          0.00946 106.  
## 5 X5          0.103     9.68
## 6 X6          0.433     2.31
## 7 X7          0.0487   20.5 
## 8 X8          0.182     5.50
## 9 X9          0.174     5.75
## 
## 
## Eigenvalue and Condition Index
## ------------------------------
##      Eigenvalue Condition Index    intercept           X1           X2           X3           X4           X5           X6           X7           X8           X9
## 1  8.3083720047        1.000000 8.462607e-06 5.328594e-05 6.355151e-04 0.0000760217 4.150013e-05 1.417617e-05 0.0008109678 0.0001066112 0.0003687855 0.0004151973
## 2  0.9146513033        3.013909 3.727606e-05 1.954057e-03 1.170272e-02 0.0005739095 4.651577e-04 1.081786e-04 0.0035456399 0.0001817229 0.0014730291 0.0011848948
## 3  0.3157077198        5.129976 6.956546e-06 1.601785e-05 8.257711e-05 0.0097446616 4.781482e-04 6.076062e-09 0.0081810709 0.0002927965 0.0175774490 0.0291828488
## 4  0.2027798531        6.400967 1.163563e-04 3.691673e-04 2.265111e-02 0.0069919127 2.025445e-03 4.207282e-04 0.0594054477 0.0037315932 0.0110815180 0.0196917886
## 5  0.1283540243        8.045503 1.755786e-04 5.211575e-03 9.236831e-02 0.0009818635 6.328213e-05 7.604150e-04 0.2692817619 0.0006913700 0.0007246206 0.0003261699
## 6  0.0839205416        9.950017 8.653324e-04 5.816786e-03 2.768292e-01 0.0026483249 1.709813e-03 1.680493e-03 0.0721703877 0.0040196108 0.0166605115 0.0059859638
## 7  0.0244574411       18.431151 1.228356e-03 3.944752e-02 5.573983e-02 0.0007406723 7.931546e-03 3.329908e-03 0.0049097386 0.1485064761 0.0995363708 0.0995227734
## 8  0.0172641365       21.937419 6.575462e-04 7.426534e-03 4.085809e-02 0.0003554064 4.944025e-03 2.364272e-04 0.0098681982 0.0384834510 0.7571879031 0.7489277292
## 9  0.0041546428       44.718901 7.061743e-03 1.591436e-01 4.939182e-02 0.1757973907 3.445438e-01 3.963894e-02 0.3794245842 0.2469894804 0.0193633680 0.0055230478
## 10 0.0003383328      156.706094 9.898424e-01 7.805615e-01 4.497409e-01 0.8020898367 6.377973e-01 9.538107e-01 0.1924022031 0.5569968879 0.0760264445 0.0892395862

(d) The fitted log model

# build full log model
model_wf_full_log <- lm(log(y) ~ X1 + X2 + X3 + X4 + X5 + X6 + X7 + X8 + X9, data=table_wf)
ols_regress(model_wf_full_log)
##                         Model Summary                         
## -------------------------------------------------------------
## R                       0.973       RMSE               0.433 
## R-Squared               0.947       Coef. Var          6.808 
## Adj. R-Squared          0.924       MSE                0.188 
## Pred R-Squared          0.886       MAE                0.265 
## -------------------------------------------------------------
##  RMSE: Root Mean Square Error 
##  MSE: Mean Square Error 
##  MAE: Mean Absolute Error 
## 
##                                ANOVA                                
## -------------------------------------------------------------------
##                Sum of                                              
##               Squares        DF    Mean Square      F         Sig. 
## -------------------------------------------------------------------
## Regression     67.635         9          7.515    40.002    0.0000 
## Residual        3.757        20          0.188                     
## Total          71.393        29                                    
## -------------------------------------------------------------------
## 
##                                    Parameter Estimates                                    
## -----------------------------------------------------------------------------------------
##       model      Beta    Std. Error    Std. Beta      t        Sig       lower     upper 
## -----------------------------------------------------------------------------------------
## (Intercept)     3.402         3.150                  1.080    0.293     -3.169     9.974 
##          X1    -0.014         0.292       -0.024    -0.046    0.963     -0.622     0.595 
##          X2    -1.024         6.995       -0.021    -0.146    0.885    -15.615    13.568 
##          X3     0.178         0.111        0.461     1.602    0.125     -0.054     0.410 
##          X4     0.109         0.115        0.498     0.944    0.357     -0.132     0.349 
##          X5    -0.010         0.036       -0.042    -0.265    0.794     -0.085     0.066 
##          X6    -0.389         0.216       -0.141    -1.804    0.086     -0.840     0.061 
##          X7     4.233         2.339        0.421     1.810    0.085     -0.646     9.113 
##          X8     0.630         0.149        0.508     4.223    0.000      0.319     0.941 
##          X9    -0.462         0.122       -0.465    -3.778    0.001     -0.717    -0.207 
## -----------------------------------------------------------------------------------------
#Model Fit Assessment
ols_plot_diagnostics(model_wf_full_log)

# Part & Partial Correlations
ols_test_correlation(model_wf_full_log) # Correlation between observed residuals and expected residuals under normality.
## [1] 0.9808603
# Residual Normality Test
ols_test_normality(model_wf_full_log) # Test for detecting violation of normality assumption. #If p-value is bigger, then no problem of non-normality #
## -----------------------------------------------
##        Test             Statistic       pvalue  
## -----------------------------------------------
## Shapiro-Wilk              0.9659         0.4341 
## Kolmogorov-Smirnov        0.0993         0.9007 
## Cramer-von Mises          4.948          0.0000 
## Anderson-Darling          0.3686         0.4062 
## -----------------------------------------------

(d) (2) Collinearity diagnostics

# for log model
ols_coll_diag(model_wf_full_log)
## Tolerance and Variance Inflation Factor
## ---------------------------------------
## # A tibble: 9 x 3
##   Variables Tolerance    VIF
##   <chr>         <dbl>  <dbl>
## 1 X1          0.00982 102.  
## 2 X2          0.133     7.52
## 3 X3          0.0318   31.4 
## 4 X4          0.00946 106.  
## 5 X5          0.103     9.68
## 6 X6          0.433     2.31
## 7 X7          0.0487   20.5 
## 8 X8          0.182     5.50
## 9 X9          0.174     5.75
## 
## 
## Eigenvalue and Condition Index
## ------------------------------
##      Eigenvalue Condition Index    intercept           X1           X2           X3           X4           X5           X6           X7           X8           X9
## 1  8.3083720047        1.000000 8.462607e-06 5.328594e-05 6.355151e-04 0.0000760217 4.150013e-05 1.417617e-05 0.0008109678 0.0001066112 0.0003687855 0.0004151973
## 2  0.9146513033        3.013909 3.727606e-05 1.954057e-03 1.170272e-02 0.0005739095 4.651577e-04 1.081786e-04 0.0035456399 0.0001817229 0.0014730291 0.0011848948
## 3  0.3157077198        5.129976 6.956546e-06 1.601785e-05 8.257711e-05 0.0097446616 4.781482e-04 6.076062e-09 0.0081810709 0.0002927965 0.0175774490 0.0291828488
## 4  0.2027798531        6.400967 1.163563e-04 3.691673e-04 2.265111e-02 0.0069919127 2.025445e-03 4.207282e-04 0.0594054477 0.0037315932 0.0110815180 0.0196917886
## 5  0.1283540243        8.045503 1.755786e-04 5.211575e-03 9.236831e-02 0.0009818635 6.328213e-05 7.604150e-04 0.2692817619 0.0006913700 0.0007246206 0.0003261699
## 6  0.0839205416        9.950017 8.653324e-04 5.816786e-03 2.768292e-01 0.0026483249 1.709813e-03 1.680493e-03 0.0721703877 0.0040196108 0.0166605115 0.0059859638
## 7  0.0244574411       18.431151 1.228356e-03 3.944752e-02 5.573983e-02 0.0007406723 7.931546e-03 3.329908e-03 0.0049097386 0.1485064761 0.0995363708 0.0995227734
## 8  0.0172641365       21.937419 6.575462e-04 7.426534e-03 4.085809e-02 0.0003554064 4.944025e-03 2.364272e-04 0.0098681982 0.0384834510 0.7571879031 0.7489277292
## 9  0.0041546428       44.718901 7.061743e-03 1.591436e-01 4.939182e-02 0.1757973907 3.445438e-01 3.963894e-02 0.3794245842 0.2469894804 0.0193633680 0.0055230478
## 10 0.0003383328      156.706094 9.898424e-01 7.805615e-01 4.497409e-01 0.8020898367 6.377973e-01 9.538107e-01 0.1924022031 0.5569968879 0.0760264445 0.0892395862
# remove X4
ols_vif_tol(lm(log(y) ~ X1 + X2 + X3 + X5 + X6 + X7 + X8 + X9, data=table_wf))
## # A tibble: 8 x 3
##   Variables Tolerance   VIF
##   <chr>         <dbl> <dbl>
## 1 X1            0.119  8.38
## 2 X2            0.209  4.79
## 3 X3            0.379  2.64
## 4 X5            0.187  5.35
## 5 X6            0.836  1.20
## 6 X7            0.165  6.05
## 7 X8            0.187  5.35
## 8 X9            0.183  5.45
# remove X1
ols_vif_tol(lm(log(y) ~ X2 + X3 + X4 + X5 + X6 + X7 + X8 + X9, data=table_wf))
## # A tibble: 8 x 3
##   Variables Tolerance   VIF
##   <chr>         <dbl> <dbl>
## 1 X2            0.245  4.08
## 2 X3            0.318  3.14
## 3 X4            0.115  8.70
## 4 X5            0.283  3.54
## 5 X6            0.717  1.39
## 6 X7            0.118  8.46
## 7 X8            0.190  5.27
## 8 X9            0.185  5.41
# remove X3
ols_vif_tol(lm(log(y) ~ X1 + X2 + X4 + X5 + X6 + X7 + X8 + X9, data=table_wf))
## # A tibble: 8 x 3
##   Variables Tolerance   VIF
##   <chr>         <dbl> <dbl>
## 1 X1           0.0983 10.2 
## 2 X2           0.243   4.11
## 3 X4           0.113   8.87
## 4 X5           0.272   3.68
## 5 X6           0.767   1.30
## 6 X7           0.206   4.85
## 7 X8           0.190   5.26
## 8 X9           0.187   5.36
# remove X7
ols_vif_tol(lm(log(y) ~ X1 + X2 + X3 + X4 + X5 + X6 + X8 + X9, data=table_wf))
## # A tibble: 8 x 3
##   Variables Tolerance   VIF
##   <chr>         <dbl> <dbl>
## 1 X1           0.0238 42.0 
## 2 X2           0.310   3.22
## 3 X3           0.135   7.42
## 4 X4           0.0321 31.1 
## 5 X5           0.164   6.09
## 6 X6           0.740   1.35
## 7 X8           0.184   5.45
## 8 X9           0.177   5.66
# remove X5
ols_vif_tol(lm(log(y) ~ X1 + X2 + X3 + X4 + X6 + X7 + X8 + X9, data=table_wf))
## # A tibble: 8 x 3
##   Variables Tolerance   VIF
##   <chr>         <dbl> <dbl>
## 1 X1           0.0269 37.2 
## 2 X2           0.210   4.77
## 3 X3           0.0836 12.0 
## 4 X4           0.0171 58.4 
## 5 X6           0.485   2.06
## 6 X7           0.0774 12.9 
## 7 X8           0.200   5.01
## 8 X9           0.190   5.25
# build X4 eliminated log model
model_wf_rm4_log <- lm(log(y) ~ X1 + X2 + X3 + X5 + X6 + X7 + X8 + X9, data=table_wf)
ols_regress(model_wf_rm4_log)
##                         Model Summary                         
## -------------------------------------------------------------
## R                       0.972       RMSE               0.432 
## R-Squared               0.945       Coef. Var          6.790 
## Adj. R-Squared          0.924       MSE                0.187 
## Pred R-Squared          0.890       MAE                0.275 
## -------------------------------------------------------------
##  RMSE: Root Mean Square Error 
##  MSE: Mean Square Error 
##  MAE: Mean Absolute Error 
## 
##                                ANOVA                                
## -------------------------------------------------------------------
##                Sum of                                              
##               Squares        DF    Mean Square      F         Sig. 
## -------------------------------------------------------------------
## Regression     67.468         8          8.433    45.126    0.0000 
## Residual        3.925        21          0.187                     
## Total          71.393        29                                    
## -------------------------------------------------------------------
## 
##                                    Parameter Estimates                                    
## -----------------------------------------------------------------------------------------
##       model      Beta    Std. Error    Std. Beta      t        Sig       lower     upper 
## -----------------------------------------------------------------------------------------
## (Intercept)     1.200         2.110                  0.568    0.576     -3.189     5.588 
##          X1     0.250         0.083        0.444     2.998    0.007      0.077     0.424 
##          X2    -5.001         5.568       -0.101    -0.898    0.379    -16.581     6.578 
##          X3     0.278         0.032        0.721     8.675    0.000      0.212     0.345 
##          X5     0.013         0.027        0.058     0.494    0.626     -0.043     0.069 
##          X6    -0.531         0.155       -0.192    -3.424    0.003     -0.853    -0.208 
##          X7     6.088         1.266        0.605     4.809    0.000      3.455     8.721 
##          X8     0.606         0.147        0.489     4.134    0.000      0.301     0.912 
##          X9    -0.436         0.119       -0.438    -3.669    0.001     -0.683    -0.189 
## -----------------------------------------------------------------------------------------
# build the X1-eliminated log model
model_wf_rm1_log <- lm(log(y) ~ X2 + X3 + X4 + X5 + X6 + X7 + X8 + X9, data=table_wf)
ols_regress(model_wf_rm1_log)
##                         Model Summary                         
## -------------------------------------------------------------
## R                       0.973       RMSE               0.423 
## R-Squared               0.947       Coef. Var          6.644 
## Adj. R-Squared          0.927       MSE                0.179 
## Pred R-Squared          0.896       MAE                0.264 
## -------------------------------------------------------------
##  RMSE: Root Mean Square Error 
##  MSE: Mean Square Error 
##  MAE: Mean Absolute Error 
## 
##                                ANOVA                                
## -------------------------------------------------------------------
##                Sum of                                              
##               Squares        DF    Mean Square      F         Sig. 
## -------------------------------------------------------------------
## Regression     67.635         8          8.454    47.247    0.0000 
## Residual        3.758        21          0.179                     
## Total          71.393        29                                    
## -------------------------------------------------------------------
## 
##                                    Parameter Estimates                                    
## -----------------------------------------------------------------------------------------
##       model      Beta    Std. Error    Std. Beta      t        Sig       lower     upper 
## -----------------------------------------------------------------------------------------
## (Intercept)     3.280         1.690                  1.941    0.066     -0.233     6.794 
##          X2    -1.243         5.025       -0.025    -0.247    0.807    -11.694     9.208 
##          X3     0.183         0.034        0.473     5.336    0.000      0.112     0.254 
##          X4     0.104         0.032        0.474     3.213    0.004      0.037     0.171 
##          X5    -0.008         0.021       -0.036    -0.386    0.703     -0.053     0.036 
##          X6    -0.396         0.164       -0.143    -2.416    0.025     -0.736    -0.055 
##          X7     4.317         1.466        0.429     2.945    0.008      1.268     7.365 
##          X8     0.629         0.142        0.507     4.413    0.000      0.332     0.925 
##          X9    -0.461         0.116       -0.463    -3.980    0.001     -0.702    -0.220 
## -----------------------------------------------------------------------------------------
library(huxtable)
huxreg(model_wf_full, model_wf_full_log, model_wf_rm4_log, model_wf_rm1_log)
                    (1)          (2)          (3)          (4)
(Intercept)     292.561        3.402        1.200        3.280
              (4428.618)      (3.150)      (2.110)      (1.690)
X1             -203.144       -0.014        0.250 **
               (410.268)      (0.292)      (0.083)
X2             1055.782       -1.024       -5.001       -1.243
              (9833.700)      (6.995)      (5.568)      (5.025)
X3              -49.240        0.178        0.278 ***    0.183 ***
               (156.200)      (0.111)      (0.032)      (0.034)
X4              209.762        0.109                     0.104 **
               (162.046)      (0.115)                   (0.032)
X5              -10.197       -0.010        0.013       -0.008
                (51.088)      (0.036)      (0.027)      (0.021)
X6              -24.558       -0.389       -0.531 **    -0.396 *
               (303.529)      (0.216)      (0.155)      (0.164)
X7              142.778        4.233        6.088 ***    4.317 **
              (3288.443)      (2.339)      (1.266)      (1.466)
X8              511.713 *      0.630 ***    0.606 ***    0.629 ***
               (209.741)      (0.149)      (0.147)      (0.142)
X9             -301.872       -0.462 **    -0.436 **    -0.461 ***
               (171.996)      (0.122)      (0.119)      (0.116)
N                30           30           30           30
R2                0.821        0.947        0.945        0.947
logLik         -228.856      -11.406      -12.059      -11.407
AIC             479.712       44.811       44.118       42.815
*** p < 0.001; ** p < 0.01; * p < 0.05.
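The AIC row of the comparison table can be checked against the logLik row: for a Gaussian linear model, AIC = 2k − 2·logLik, where k counts the estimated parameters (intercept plus slopes, plus one for the error variance). A quick arithmetic check in Python on the printed values (last-digit differences are rounding):

```python
# Reproduce AIC = 2*k - 2*logLik from the huxreg rows above.
# k = number of coefficients + 1 for the error variance.
models = {
    # name: (logLik, number of coefficients)
    "full":     (-228.856, 10),  # intercept + X1..X9
    "full_log": (-11.406, 10),
    "rm4_log":  (-12.059, 9),    # X4 dropped
    "rm1_log":  (-11.407, 9),    # X1 dropped
}

for name, (loglik, n_coef) in models.items():
    k = n_coef + 1               # + sigma^2
    aic = 2 * k - 2 * loglik
    print(f"{name}: AIC = {aic:.3f}")
```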

(d) (3) Variable selection

# Lack of Fit F Test

ols_pure_error_anova(lm(y~X1, data = table_wf))
## Lack of Fit F Test 
## ---------------
## Response :   y 
## Predictor:   X1 
## 
##                          Analysis of Variance Table                          
## ----------------------------------------------------------------------------
##                 DF      Sum Sq         Mean Sq      F Value        Pr(>F)    
## ----------------------------------------------------------------------------
## X1               1    25376844.15    25376844.15    60.28234    1.856979e-08 
## Residual        28    16191291.22      578260.40                             
##  Lack of fit     4     6088095.39     1522023.85    3.615546      0.01920643 
##  Pure Error     24    10103195.83      420966.49                             
## ----------------------------------------------------------------------------
ols_pure_error_anova(lm(y~X4, data = table_wf))
## Lack of Fit F Test 
## ---------------
## Response :   y 
## Predictor:   X4 
## 
##                          Analysis of Variance Table                           
## -----------------------------------------------------------------------------
##                 DF      Sum Sq         Mean Sq       F Value        Pr(>F)    
## -----------------------------------------------------------------------------
## X4               1    31156255.72    31156255.72     76.53169    1.689561e-09 
## Residual        28    10411879.64      371852.84                              
##  Lack of fit     3      234313.14       78104.38    0.1918543       0.9009474 
##  Pure Error     25    10177566.50      407102.66                              
## -----------------------------------------------------------------------------
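Each lack-of-fit F statistic above is simply the ratio of the lack-of-fit mean square to the pure-error mean square. A quick check in Python using the sums of squares and degrees of freedom printed in the two ANOVA decompositions:

```python
# Lack-of-fit F = MS(lack of fit) / MS(pure error),
# with MS = SS / df taken from the tables above.
def lof_f(ss_lof, df_lof, ss_pe, df_pe):
    return (ss_lof / df_lof) / (ss_pe / df_pe)

f_x1 = lof_f(6088095.39, 4, 10103195.83, 24)   # y ~ X1
f_x4 = lof_f(234313.14, 3, 10177566.50, 25)    # y ~ X4
print(f"X1: F = {f_x1:.4f}")
print(f"X4: F = {f_x4:.4f}")
```

The large statistic for y ~ X1 (p ≈ 0.019) signals lack of fit, while y ~ X4 shows none (p ≈ 0.90).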
alias(lm(y ~ as.factor(X3) + as.factor(X4) + as.factor(X5) + as.factor(X6) + as.factor(X7), data=table_wf))
## Model :
## y ~ as.factor(X3) + as.factor(X4) + as.factor(X5) + as.factor(X6) + 
##     as.factor(X7)
## 
## Complete :
##                   (Intercept) as.factor(X3)6 as.factor(X3)6.5 as.factor(X3)7 as.factor(X3)15 as.factor(X4)2 as.factor(X5)60 as.factor(X5)65 as.factor(X5)70 as.factor(X6)1
## as.factor(X4)10    0           0              0                0              1               0              0               0               0               0            
## as.factor(X4)15    0           1              0                1              0               0              0               0               0               0            
## as.factor(X4)19    0           0              1                0              0              -1              0               0               0               0            
## as.factor(X5)62    0           1              0                0              0               0              0               0               0               0            
## as.factor(X5)67    0           0              0                1              0               0              0               0               0               0            
## as.factor(X5)68    0           0              0                0              1               1             -1              -1               0               0            
## as.factor(X5)80    1          -1             -1               -1             -1               0              0               0              -1               0            
## as.factor(X6)1.5   0           1              0                0              0               0              0               0               1               0            
## as.factor(X6)2     1          -1              0               -1             -1              -1              1               1              -1              -1            
## as.factor(X7)0.2   0           0              0                0              1               0              0               0               0               0            
## as.factor(X7)0.25  1          -1             -1               -1             -1               0              0               0               0               0            
## as.factor(X7)0.35  0           0              0                0             -1               0              1               1               0               0            
## as.factor(X7)0.5   0           0              1                1              0              -1              0               0               0               0            
## as.factor(X7)0.6   0           1              0                0              0               0              0               0               0               0
alias(lm(y ~ as.factor(X1) + as.factor(X8) , data=table_wf))
## Model :
## y ~ as.factor(X1) + as.factor(X8)
## 
## Complete :
##                  (Intercept) as.factor(X1)0.13 as.factor(X1)1 as.factor(X1)3 as.factor(X1)5 as.factor(X1)7 as.factor(X8)1.25 as.factor(X8)1.45 as.factor(X8)1.5 as.factor(X8)1.6 as.factor(X8)1.75 as.factor(X8)1.8 as.factor(X8)2.25 as.factor(X8)2.3 as.factor(X8)2.6 as.factor(X8)2.75 as.factor(X8)2.9 as.factor(X8)3.1 as.factor(X8)3.25 as.factor(X8)3.6 as.factor(X8)3.9 as.factor(X8)4 as.factor(X8)4.25 as.factor(X8)4.75 as.factor(X8)4.76 as.factor(X8)5 as.factor(X8)5.25
## as.factor(X8)4.2  0           0                 0              0              1              0              0                 0                -1                0                0                 0                0                 0                0               -1                 0                0                0                 0                0                0              0                 0                 0                 0              0
alias(lm(y ~ as.factor(X4) + as.factor(X9) , data=table_wf))
## Model :
## y ~ as.factor(X4) + as.factor(X9)
alias(lm(y ~ as.factor(X3) + as.factor(X6) + as.factor(X7) + as.factor(X8) + as.factor(X9) , data=table_wf))
## Model :
## y ~ as.factor(X3) + as.factor(X6) + as.factor(X7) + as.factor(X8) + 
##     as.factor(X9)
## 
## Complete :
##                   (Intercept) as.factor(X3)6 as.factor(X3)6.5 as.factor(X3)7 as.factor(X3)15 as.factor(X6)1 as.factor(X6)1.5 as.factor(X6)2 as.factor(X7)0.35 as.factor(X8)1.25 as.factor(X8)1.45 as.factor(X8)1.5 as.factor(X8)1.6 as.factor(X8)1.75 as.factor(X8)1.8 as.factor(X8)2.25 as.factor(X8)2.3 as.factor(X8)2.6 as.factor(X8)2.75 as.factor(X8)2.9 as.factor(X8)3.1 as.factor(X8)3.25 as.factor(X8)3.6 as.factor(X8)4 as.factor(X8)4.25 as.factor(X8)4.75 as.factor(X8)4.76 as.factor(X8)5 as.factor(X8)5.25 as.factor(X9)1.5
## as.factor(X9)2    -3           0              3                3              2               1              3                0              0                 0                 1                 1                3                1                 1                0                 0                0                0                -1                2                0                 2                0              0                 1                -1                 0              0                -1              
## as.factor(X9)2.4   0           0              0                0              0               0              0                0              0                 0                 0                 0                0                0                 0                0                 0                0                0                 0                0                0                 1                0              0                 0                 0                 0              0                 0              
## as.factor(X9)3     1           0              0               -1              0              -1             -1               -1              0                 0                -1                 0                0                0                 0                0                 0               -1                1                 0                0                0                 0                0              0                 0                 0                 0              0                 0              
## as.factor(X9)3.4   0           0              0                0              0               0              0                0              0                 0                 0                 0                0                0                 0                0                 0                0                0                 1                0                0                 0                0              0                 0                 0                 0              0                 0              
## as.factor(X9)3.5   0           0              0                0              0               0              0                0              0                 0                 0                 0                0                0                 0                0                 1                0                0                 0                0                0                 0                0              0                 0                 0                 0              0                 0              
## as.factor(X9)3.7   0           0              0                0              0               0              0                0              0                 0                 0                 0                0                0                 0                1                 0                0                0                 0                0                0                 0                0              0                 0                 0                 0              0                 0              
## as.factor(X9)4     1           0             -1               -1             -1               0             -1                0              0                 0                 0                 0               -1                0                 0                0                 0                1                0                 0                0                1                -1                0              1                 0                 0                 0              0                 0              
## as.factor(X9)4.2   0           0              0                0              0               0              0                0              0                 0                 0                 0                0                0                 0                0                 0                0                0                 0                0                0                 0                1              0                 0                 0                 0              0                 0              
## as.factor(X9)5     0           1              0                0              0               0              0                0              0                 0                 0                -1                0                0                 0                0                 0                0               -1                 0                0                0                 0                0              0                 0                 1                 0              0                 0              
## as.factor(X9)6     0           0              0                0              0               0              0                0              0                 0                 0                 0                0                0                 0                0                 0                0                0                 0                0                0                 0                0              0                 1                 0                 0              1                 0              
## as.factor(X9)6.5   0           0              0                0              0               0              0                0              0                 0                 0                 0                0                0                 0                0                 0                0                0                 0                0                0                 0                0              0                 0                 0                 1              0                 0              
## as.factor(X7)0.2   0           0              0                0              1               0              0                0              0                 0                 0                 0                0                0                 0                0                 0                0                0                 0                0                0                 0                0              0                 0                 0                 0              0                 0              
## as.factor(X7)0.25  1          -1             -1               -1             -1               0              0                0              0                 0                 0                 0                0                0                 0                0                 0                0                0                 0                0                0                 0                0              0                 0                 0                 0              0                 0              
## as.factor(X7)0.5  -1           0              1                2              0               1              1                1             -1                 0                 0                 0                0                0                 0                0                 0                0                0                 0                0                0                 0                0              0                 0                 0                 0              0                 0              
## as.factor(X7)0.6   0           1              0                0              0               0              0                0              0                 0                 0                 0                0                0                 0                0                 0                0                0                 0                0                0                 0                0              0                 0                 0                 0              0                 0              
## as.factor(X8)3.9   1           0              0               -1              0              -1             -1               -1              0                 0                -1                 0                0                0                 0                0                 0               -1                0                 0                0                0                 0                0              0                 0                 0                 0              0                 0              
## as.factor(X8)4.2   0           1              0                0              0               0              0                0              0                 0                 0                -1                0                0                 0                0                 0                0               -1                 0                0                0                 0                0              0                 0                 0                 0              0                 0              
## as.factor(X9)1     2          -1             -2               -1             -1               0             -1                1             -1                 0                 0                 0               -2               -1                -1               -1                 0                0                0                 0               -2               -1                -2               -1              0                -2                 0                -1             -1                 0
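`alias()` flags factor levels whose dummy columns are exact linear combinations of the other columns, i.e. the design matrix is rank-deficient. A minimal illustration of the same idea in Python with numpy, on hypothetical dummy columns (not the table_wf data):

```python
import numpy as np

# Hypothetical design matrix: intercept, two dummies, and a third
# column that equals (intercept - d1 - d2), so it is aliased.
icpt = np.ones(6)
d1 = np.array([1, 1, 0, 0, 0, 0], dtype=float)
d2 = np.array([0, 0, 1, 1, 0, 0], dtype=float)
d3 = icpt - d1 - d2                      # exact linear combination
X = np.column_stack([icpt, d1, d2, d3])

# Rank < number of columns reveals the aliasing; alias() goes one
# step further and expresses the dependent column in terms of the others.
print(X.shape[1], np.linalg.matrix_rank(X))  # 4 columns, rank 3
```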

(d) (4) Forward selection

Stepwise Forward Regression for the full model

# Stepwise Forward Regression based on p values (use α=0.15) #
ols_step_forward_p(model_wf_full_log, penter = 0.15)
## Forward Selection Method    
## ---------------------------
## 
## Candidate Terms: 
## 
## 1. X1 
## 2. X2 
## 3. X3 
## 4. X4 
## 5. X5 
## 6. X6 
## 7. X7 
## 8. X8 
## 9. X9 
## 
## We are selecting variables based on p value...
## 
## Variables Entered: 
## 
## - X4 
## - X3 
## - X7 
## 
## No more variables to be added.
## 
## Final Model Output 
## ------------------
## 
##                         Model Summary                         
## -------------------------------------------------------------
## R                       0.944       RMSE               0.549 
## R-Squared               0.890       Coef. Var          8.618 
## Adj. R-Squared          0.878       MSE                0.301 
## Pred R-Squared          0.854       MAE                0.414 
## -------------------------------------------------------------
##  RMSE: Root Mean Square Error 
##  MSE: Mean Square Error 
##  MAE: Mean Absolute Error 
## 
##                                ANOVA                                
## -------------------------------------------------------------------
##                Sum of                                              
##               Squares        DF    Mean Square      F         Sig. 
## -------------------------------------------------------------------
## Regression     63.565         3         21.188    70.378    0.0000 
## Residual        7.828        26          0.301                     
## Total          71.393        29                                    
## -------------------------------------------------------------------
## 
##                                  Parameter Estimates                                  
## -------------------------------------------------------------------------------------
##       model     Beta    Std. Error    Std. Beta      t       Sig      lower    upper 
## -------------------------------------------------------------------------------------
## (Intercept)    2.872         0.547                 5.254    0.000     1.748    3.995 
##          X4    0.122         0.033        0.559    3.730    0.001     0.055    0.189 
##          X3    0.168         0.040        0.435    4.165    0.000     0.085    0.251 
##          X7    3.106         1.537        0.309    2.021    0.054    -0.053    6.266 
## -------------------------------------------------------------------------------------
## 
##                            Selection Summary                             
## ------------------------------------------------------------------------
##         Variable                  Adj.                                      
## Step    Entered     R-Square    R-Square     C(p)        AIC       RMSE     
## ------------------------------------------------------------------------
##    1    X4            0.8030      0.7960    48.8552    68.4060    0.7087    
##    2    X3            0.8731      0.8637    24.2129    57.2082    0.5792    
##    3    X7            0.8904      0.8777    19.6668    54.8305    0.5487    
## ------------------------------------------------------------------------
# Stepwise AIC Forward Regression #
ols_step_forward_aic(model_wf_full_log)
## Forward Selection Method 
## ------------------------
## 
## Candidate Terms: 
## 
## 1 . X1 
## 2 . X2 
## 3 . X3 
## 4 . X4 
## 5 . X5 
## 6 . X6 
## 7 . X7 
## 8 . X8 
## 9 . X9 
## 
## 
## Variables Entered: 
## 
## - X4 
## - X3 
## - X7 
## - X8 
## - X9 
## - X6 
## 
## No more variables to be added.
## 
##                        Selection Summary                        
## ---------------------------------------------------------------
## Variable      AIC      Sum Sq     RSS       R-Sq      Adj. R-Sq 
## ---------------------------------------------------------------
## X4           68.406    57.330    14.063    0.80302      0.79599 
## X3           57.208    62.335     9.057    0.87313      0.86373 
## X7           54.830    63.565     7.828    0.89036      0.87771 
## X8           54.522    64.144     7.248    0.89848      0.88223 
## X9           44.504    66.537     4.856    0.93199      0.91782 
## X6           39.161    67.591     3.801    0.94675      0.93286 
## ---------------------------------------------------------------
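The R-Sq column of the AIC selection summary is consistent with its RSS column and the total sum of squares from the ANOVA tables above (Total = 71.393): R² = 1 − RSS/SS_total. A quick Python check on the printed values (tiny last-digit differences are rounding):

```python
# R^2 = 1 - RSS / SS_total, using the selection-summary rows above.
ss_total = 71.393          # Total SS of the log models
steps = {                  # variable entered: running RSS
    "X4": 14.063, "X3": 9.057, "X7": 7.828,
    "X8": 7.248, "X9": 4.856, "X6": 3.801,
}
for var, rss in steps.items():
    print(f"{var}: R-Sq = {1 - rss / ss_total:.5f}")
```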

Stepwise Forward Regression for the X4-eliminated model

# Stepwise Forward Regression based on p values (use α=0.15) #
ols_step_forward_p(model_wf_rm4_log, penter = 0.15)
## Forward Selection Method    
## ---------------------------
## 
## Candidate Terms: 
## 
## 1. X1 
## 2. X2 
## 3. X3 
## 4. X5 
## 5. X6 
## 6. X7 
## 7. X8 
## 8. X9 
## 
## We are selecting variables based on p value...
## 
## Variables Entered: 
## 
## - X1 
## - X3 
## - X7 
## - X6 
## - X8 
## - X9 
## 
## No more variables to be added.
## 
## Final Model Output 
## ------------------
## 
##                         Model Summary                         
## -------------------------------------------------------------
## R                       0.971       RMSE               0.421 
## R-Squared               0.943       Coef. Var          6.618 
## Adj. R-Squared          0.928       MSE                0.178 
## Pred R-Squared          0.900       MAE                0.292 
## -------------------------------------------------------------
##  RMSE: Root Mean Square Error 
##  MSE: Mean Square Error 
##  MAE: Mean Absolute Error 
## 
##                                ANOVA                                
## -------------------------------------------------------------------
##                Sum of                                              
##               Squares        DF    Mean Square      F         Sig. 
## -------------------------------------------------------------------
## Regression     67.310         6         11.218    63.195    0.0000 
## Residual        4.083        23          0.178                     
## Total          71.393        29                                    
## -------------------------------------------------------------------
## 
##                                   Parameter Estimates                                    
## ----------------------------------------------------------------------------------------
##       model      Beta    Std. Error    Std. Beta      t        Sig      lower     upper 
## ----------------------------------------------------------------------------------------
## (Intercept)     2.307         0.410                  5.623    0.000     1.458     3.156 
##          X1     0.207         0.053        0.368     3.897    0.001     0.097     0.317 
##          X3     0.263         0.022        0.680    11.944    0.000     0.217     0.308 
##          X7     5.453         1.002        0.542     5.442    0.000     3.380     7.525 
##          X6    -0.532         0.144       -0.192    -3.688    0.001    -0.831    -0.234 
##          X8     0.613         0.137        0.495     4.462    0.000     0.329     0.897 
##          X9    -0.433         0.112       -0.435    -3.864    0.001    -0.665    -0.201 
## ----------------------------------------------------------------------------------------
## 
##                             Selection Summary                             
## -------------------------------------------------------------------------
##         Variable                  Adj.                                       
## Step    Entered     R-Square    R-Square      C(p)        AIC       RMSE     
## -------------------------------------------------------------------------
##    1    X1            0.5266      0.5097    154.8516    94.7131    1.0987    
##    2    X3            0.8121      0.7981     47.7988    68.9988    0.7050    
##    3    X7            0.8718      0.8570     26.9889    59.5306    0.5934    
##    4    X6            0.8932      0.8761     20.8073    56.0486    0.5523    
##    5    X8            0.9057      0.8860     18.0270    54.3108    0.5297    
##    6    X9            0.9428      0.9279      5.8470    41.3046    0.4213    
## -------------------------------------------------------------------------
# Stepwise AIC Forward Regression #
ols_step_forward_aic(model_wf_rm4_log)
## Forward Selection Method 
## ------------------------
## 
## Candidate Terms: 
## 
## 1 . X1 
## 2 . X2 
## 3 . X3 
## 4 . X5 
## 5 . X6 
## 6 . X7 
## 7 . X8 
## 8 . X9 
## 
## 
## Variables Entered: 
## 
## - X1 
## - X3 
## - X7 
## - X6 
## - X8 
## - X9 
## 
## No more variables to be added.
## 
##                        Selection Summary                        
## ---------------------------------------------------------------
## Variable      AIC      Sum Sq     RSS       R-Sq      Adj. R-Sq 
## ---------------------------------------------------------------
## X1           94.713    37.594    33.799    0.52658      0.50967 
## X3           68.999    57.974    13.418    0.81205      0.79813 
## X7           59.531    62.237     9.155    0.87176      0.85696 
## X6           56.049    63.766     7.626    0.89318      0.87609 
## X8           54.311    64.660     6.733    0.90569      0.88604 
## X9           41.305    67.310     4.083    0.94281      0.92789 
## ---------------------------------------------------------------
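Both selection summaries for this model enter the variables in the same order, so the C(p) column of the p-value summary can be reproduced from the running RSS in the AIC summary together with the MSE of the candidate full model, the X4-eliminated log fit (RSS 3.925 on 21 df): Mallows C(p) = RSS_p/MSE_full − (n − 2p). A Python sketch; second-decimal discrepancies come from the rounded printed values:

```python
# Mallows C(p) = RSS_p / MSE_full - (n - 2p), where p counts the
# intercept plus the variables entered so far.
n = 30
mse_full = 3.925 / 21            # X4-eliminated log model: RSS / df
rss = {                          # entry order matches both summaries
    "X1": 33.799, "X3": 13.418, "X7": 9.155,
    "X6": 7.626, "X8": 6.733, "X9": 4.083,
}
for step, (var, rss_p) in enumerate(rss.items(), start=1):
    p = step + 1                 # + intercept
    cp = rss_p / mse_full - (n - 2 * p)
    print(f"step {step} ({var}): C(p) = {cp:.2f}")
```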

Stepwise Forward Regression for the X1-eliminated model

# Stepwise Forward Regression based on p values (use α=0.15) #
ols_step_forward_p(model_wf_rm1_log, penter = 0.15)
## Forward Selection Method    
## ---------------------------
## 
## Candidate Terms: 
## 
## 1. X2 
## 2. X3 
## 3. X4 
## 4. X5 
## 5. X6 
## 6. X7 
## 7. X8 
## 8. X9 
## 
## We are selecting variables based on p value...
## 
## Variables Entered: 
## 
## - X4 
## - X3 
## - X7 
## 
## No more variables to be added.
## 
## Final Model Output 
## ------------------
## 
##                         Model Summary                         
## -------------------------------------------------------------
## R                       0.944       RMSE               0.549 
## R-Squared               0.890       Coef. Var          8.618 
## Adj. R-Squared          0.878       MSE                0.301 
## Pred R-Squared          0.854       MAE                0.414 
## -------------------------------------------------------------
##  RMSE: Root Mean Square Error 
##  MSE: Mean Square Error 
##  MAE: Mean Absolute Error 
## 
##                                ANOVA                                
## -------------------------------------------------------------------
##                Sum of                                              
##               Squares        DF    Mean Square      F         Sig. 
## -------------------------------------------------------------------
## Regression     63.565         3         21.188    70.378    0.0000 
## Residual        7.828        26          0.301                     
## Total          71.393        29                                    
## -------------------------------------------------------------------
## 
##                                  Parameter Estimates                                  
## -------------------------------------------------------------------------------------
##       model     Beta    Std. Error    Std. Beta      t       Sig      lower    upper 
## -------------------------------------------------------------------------------------
## (Intercept)    2.872         0.547                 5.254    0.000     1.748    3.995 
##          X4    0.122         0.033        0.559    3.730    0.001     0.055    0.189 
##          X3    0.168         0.040        0.435    4.165    0.000     0.085    0.251 
##          X7    3.106         1.537        0.309    2.021    0.054    -0.053    6.266 
## -------------------------------------------------------------------------------------
## 
##                            Selection Summary                             
## ------------------------------------------------------------------------
##         Variable                  Adj.                                      
## Step    Entered     R-Square    R-Square     C(p)        AIC       RMSE     
## ------------------------------------------------------------------------
##    1    X4            0.8030      0.7960    52.5895    68.4060    0.7087    
##    2    X3            0.8731      0.8637    26.6181    57.2082    0.5792    
##    3    X7            0.8904      0.8777    21.7454    54.8305    0.5487    
## ------------------------------------------------------------------------
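The ANOVA and Model Summary tables above can be cross-checked by hand. As a quick sanity check (a sketch in plain arithmetic, using only SSR, SSE, SST, and the degrees of freedom printed above with n = 30):

```python
# Independent arithmetic check of the forward-selection final model's tables.
# All inputs are read off the printed ANOVA table; nothing is re-fit here.
ssr, sse, sst = 63.565, 7.828, 71.393
df_reg, df_res, n = 3, 26, 30

msr = ssr / df_reg                                   # 21.188 in the table
mse = sse / df_res                                   # 0.301 in the table
f_stat = msr / mse                                   # 70.378 in the table (up to rounding)
r2 = ssr / sst                                       # R-Squared = 0.890
adj_r2 = 1 - (1 - r2) * (n - 1) / (n - df_reg - 1)   # Adj. R-Squared = 0.878

print(msr, mse, f_stat, r2, adj_r2)
```

Note that SSR + SSE = 71.393 reproduces the Total row exactly, and RMSE = sqrt(MSE) gives the reported 0.549.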
# Stepwise AIC Forward Regression #
ols_step_forward_aic(model_wf_rm1_log)
## Forward Selection Method 
## ------------------------
## 
## Candidate Terms: 
## 
## 1 . X2 
## 2 . X3 
## 3 . X4 
## 4 . X5 
## 5 . X6 
## 6 . X7 
## 7 . X8 
## 8 . X9 
## 
## 
## Variables Entered: 
## 
## - X4 
## - X3 
## - X7 
## - X8 
## - X9 
## - X6 
## 
## No more variables to be added.
## 
##                        Selection Summary                        
## ---------------------------------------------------------------
## Variable      AIC      Sum Sq     RSS       R-Sq      Adj. R-Sq 
## ---------------------------------------------------------------
## X4           68.406    57.330    14.063    0.80302      0.79599 
## X3           57.208    62.335     9.057    0.87313      0.86373 
## X7           54.830    63.565     7.828    0.89036      0.87771 
## X8           54.522    64.144     7.248    0.89848      0.88223 
## X9           44.504    66.537     4.856    0.93199      0.91782 
## X6           39.161    67.591     3.801    0.94675      0.93286 
## ---------------------------------------------------------------
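The AIC column in the selection summary above can be reproduced from the printed RSS values. This is a sketch assuming olsrr reports R's AIC for a Gaussian linear model, AIC = n·log(2π) + n·log(RSS/n) + n + 2(k + 1), where k counts the regression coefficients (intercept included) and the extra parameter is the error variance:

```python
import math

# Reproduce the reported AIC values from the RSS column of the summary above
# (assumes R's Gaussian log-likelihood convention for lm objects).
n = 30

def lm_aic(rss, n_coef):
    # n_coef = coefficients incl. intercept; +1 accounts for sigma^2
    return n * math.log(2 * math.pi) + n * math.log(rss / n) + n + 2 * (n_coef + 1)

print(lm_aic(14.063, 2))   # step 1 (X4 only): ~68.406 in the table
print(lm_aic(3.801, 7))    # step 6 (six predictors): ~39.16 in the table
```

Small discrepancies in the third decimal place come from the RSS values being rounded in the printed output.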

(d) (5) Backward selection

Stepwise Backward Regression for the full model

# Stepwise Backward Regression based on p values (removal threshold prem = 0.3, the olsrr default; penter applies only to forward selection) #
ols_step_backward_p(model_wf_full_log, prem = 0.3)
## Backward Elimination Method 
## ---------------------------
## 
## Candidate Terms: 
## 
## 1 . X1 
## 2 . X2 
## 3 . X3 
## 4 . X4 
## 5 . X5 
## 6 . X6 
## 7 . X7 
## 8 . X8 
## 9 . X9 
## 
## We are eliminating variables based on p value...
## 
## Variables Removed: 
## 
## - X1 
## - X2 
## - X5 
## 
## No more variables satisfy the condition of p value = 0.3
## 
## 
## Final Model Output 
## ------------------
## 
##                         Model Summary                         
## -------------------------------------------------------------
## R                       0.973       RMSE               0.407 
## R-Squared               0.947       Coef. Var          6.385 
## Adj. R-Squared          0.933       MSE                0.165 
## Pred R-Squared          0.908       MAE                0.273 
## -------------------------------------------------------------
##  RMSE: Root Mean Square Error 
##  MSE: Mean Square Error 
##  MAE: Mean Absolute Error 
## 
##                                ANOVA                                
## -------------------------------------------------------------------
##                Sum of                                              
##               Squares        DF    Mean Square      F         Sig. 
## -------------------------------------------------------------------
## Regression     67.591         6         11.265     68.16    0.0000 
## Residual        3.801        23          0.165                     
## Total          71.393        29                                    
## -------------------------------------------------------------------
## 
##                                   Parameter Estimates                                    
## ----------------------------------------------------------------------------------------
##       model      Beta    Std. Error    Std. Beta      t        Sig      lower     upper 
## ----------------------------------------------------------------------------------------
## (Intercept)     2.692         0.445                  6.046    0.000     1.771     3.613 
##          X3     0.184         0.032        0.476     5.698    0.000     0.117     0.251 
##          X4     0.109         0.026        0.499     4.244    0.000     0.056     0.162 
##          X6    -0.368         0.146       -0.133    -2.526    0.019    -0.669    -0.066 
##          X7     4.085         1.213        0.406     3.367    0.003     1.575     6.595 
##          X8     0.612         0.133        0.493     4.614    0.000     0.337     0.886 
##          X9    -0.448         0.108       -0.450    -4.135    0.000    -0.672    -0.224 
## ----------------------------------------------------------------------------------------
## 
## 
##                           Elimination Summary                           
## -----------------------------------------------------------------------
##         Variable                  Adj.                                     
## Step    Removed     R-Square    R-Square     C(p)       AIC       RMSE     
## -----------------------------------------------------------------------
##    1    X1            0.9474      0.9273    8.0021    42.8146    0.4230    
##    2    X2            0.9472      0.9304    6.0604    40.9019    0.4139    
##    3    X5            0.9468      0.9329    4.2345    39.1611    0.4065    
## -----------------------------------------------------------------------
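The C(p) column in the elimination summary above follows Mallows' definition, Cp = RSS_p / MSE_full − (n − 2p), where p is the number of parameters (intercept included) in the sub-model. A quick check (a sketch using the RSS values printed in the backward-AIC summary below; the full model has 10 parameters, so MSE_full = 3.757 / (30 − 10)):

```python
# Verify the C(p) column of the Elimination Summary from printed RSS values.
n = 30
mse_full = 3.757 / (n - 10)   # full model: 9 predictors + intercept

def mallows_cp(rss, p):
    # p = parameters in the sub-model, intercept included
    return rss / mse_full - (n - 2 * p)

print(mallows_cp(3.758, 9))   # step 1, X1 removed:  ~8.00 (table: 8.0021)
print(mallows_cp(3.769, 8))   # step 2, X2 removed:  ~6.06 (table: 6.0604)
print(mallows_cp(3.801, 7))   # step 3, X5 removed:  ~4.23 (table: 4.2345)
```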
# Stepwise AIC Backward Regression #
ols_step_backward_aic(model_wf_full_log)
## Backward Elimination Method 
## ---------------------------
## 
## Candidate Terms: 
## 
## 1 . X1 
## 2 . X2 
## 3 . X3 
## 4 . X4 
## 5 . X5 
## 6 . X6 
## 7 . X7 
## 8 . X8 
## 9 . X9 
## 
## 
## Variables Removed: 
## 
## - X1 
## - X2 
## - X5 
## 
## No more variables to be removed.
## 
## 
##                   Backward Elimination Summary                   
## ---------------------------------------------------------------
## Variable       AIC       RSS     Sum Sq     R-Sq      Adj. R-Sq 
## ---------------------------------------------------------------
## Full Model    44.811    3.757    67.635    0.94737      0.92369 
## X1            42.815    3.758    67.635    0.94737      0.92731 
## X2            40.902    3.769    67.624    0.94721      0.93042 
## X5            39.161    3.801    67.591    0.94675      0.93286 
## ---------------------------------------------------------------

Stepwise Backward Regression for the X4-eliminated model

# Stepwise Backward Regression based on p values (removal threshold prem = 0.3, the olsrr default; penter applies only to forward selection) #
ols_step_backward_p(model_wf_rm4_log, prem = 0.3)
## Backward Elimination Method 
## ---------------------------
## 
## Candidate Terms: 
## 
## 1 . X1 
## 2 . X2 
## 3 . X3 
## 4 . X5 
## 5 . X6 
## 6 . X7 
## 7 . X8 
## 8 . X9 
## 
## We are eliminating variables based on p value...
## 
## Variables Removed: 
## 
## - X5 
## - X2 
## 
## No more variables satisfy the condition of p value = 0.3
## 
## 
## Final Model Output 
## ------------------
## 
##                         Model Summary                         
## -------------------------------------------------------------
## R                       0.971       RMSE               0.421 
## R-Squared               0.943       Coef. Var          6.618 
## Adj. R-Squared          0.928       MSE                0.178 
## Pred R-Squared          0.900       MAE                0.292 
## -------------------------------------------------------------
##  RMSE: Root Mean Square Error 
##  MSE: Mean Square Error 
##  MAE: Mean Absolute Error 
## 
##                                ANOVA                                
## -------------------------------------------------------------------
##                Sum of                                              
##               Squares        DF    Mean Square      F         Sig. 
## -------------------------------------------------------------------
## Regression     67.310         6         11.218    63.195    0.0000 
## Residual        4.083        23          0.178                     
## Total          71.393        29                                    
## -------------------------------------------------------------------
## 
##                                   Parameter Estimates                                    
## ----------------------------------------------------------------------------------------
##       model      Beta    Std. Error    Std. Beta      t        Sig      lower     upper 
## ----------------------------------------------------------------------------------------
## (Intercept)     2.307         0.410                  5.623    0.000     1.458     3.156 
##          X1     0.207         0.053        0.368     3.897    0.001     0.097     0.317 
##          X3     0.263         0.022        0.680    11.944    0.000     0.217     0.308 
##          X6    -0.532         0.144       -0.192    -3.688    0.001    -0.831    -0.234 
##          X7     5.453         1.002        0.542     5.442    0.000     3.380     7.525 
##          X8     0.613         0.137        0.495     4.462    0.000     0.329     0.897 
##          X9    -0.433         0.112       -0.435    -3.864    0.001    -0.665    -0.201 
## ----------------------------------------------------------------------------------------
## 
## 
##                           Elimination Summary                           
## -----------------------------------------------------------------------
##         Variable                  Adj.                                     
## Step    Removed     R-Square    R-Square     C(p)       AIC       RMSE     
## -----------------------------------------------------------------------
##    1    X5            0.9444      0.9267    7.2445    42.4657    0.4248    
##    2    X2            0.9428      0.9279    5.8470    41.3046    0.4213    
## -----------------------------------------------------------------------
# Stepwise AIC Backward Regression #
ols_step_backward_aic(model_wf_rm4_log)
## Backward Elimination Method 
## ---------------------------
## 
## Candidate Terms: 
## 
## 1 . X1 
## 2 . X2 
## 3 . X3 
## 4 . X5 
## 5 . X6 
## 6 . X7 
## 7 . X8 
## 8 . X9 
## 
## 
## Variables Removed: 
## 
## - X5 
## - X2 
## 
## No more variables to be removed.
## 
## 
##                   Backward Elimination Summary                   
## ---------------------------------------------------------------
## Variable       AIC       RSS     Sum Sq     R-Sq      Adj. R-Sq 
## ---------------------------------------------------------------
## Full Model    44.118    3.925    67.468    0.94503      0.92409 
## X5            42.466    3.970    67.422    0.94439      0.92669 
## X2            41.305    4.083    67.310    0.94281      0.92789 
## ---------------------------------------------------------------

Stepwise Backward Regression for the X1-eliminated model

# Stepwise Backward Regression based on p values (removal threshold prem = 0.3, the olsrr default; penter applies only to forward selection) #
ols_step_backward_p(model_wf_rm1_log, prem = 0.3)
## Backward Elimination Method 
## ---------------------------
## 
## Candidate Terms: 
## 
## 1 . X2 
## 2 . X3 
## 3 . X4 
## 4 . X5 
## 5 . X6 
## 6 . X7 
## 7 . X8 
## 8 . X9 
## 
## We are eliminating variables based on p value...
## 
## Variables Removed: 
## 
## - X2 
## - X5 
## 
## No more variables satisfy the condition of p value = 0.3
## 
## 
## Final Model Output 
## ------------------
## 
##                         Model Summary                         
## -------------------------------------------------------------
## R                       0.973       RMSE               0.407 
## R-Squared               0.947       Coef. Var          6.385 
## Adj. R-Squared          0.933       MSE                0.165 
## Pred R-Squared          0.908       MAE                0.273 
## -------------------------------------------------------------
##  RMSE: Root Mean Square Error 
##  MSE: Mean Square Error 
##  MAE: Mean Absolute Error 
## 
##                                ANOVA                                
## -------------------------------------------------------------------
##                Sum of                                              
##               Squares        DF    Mean Square      F         Sig. 
## -------------------------------------------------------------------
## Regression     67.591         6         11.265     68.16    0.0000 
## Residual        3.801        23          0.165                     
## Total          71.393        29                                    
## -------------------------------------------------------------------
## 
##                                   Parameter Estimates                                    
## ----------------------------------------------------------------------------------------
##       model      Beta    Std. Error    Std. Beta      t        Sig      lower     upper 
## ----------------------------------------------------------------------------------------
## (Intercept)     2.692         0.445                  6.046    0.000     1.771     3.613 
##          X3     0.184         0.032        0.476     5.698    0.000     0.117     0.251 
##          X4     0.109         0.026        0.499     4.244    0.000     0.056     0.162 
##          X6    -0.368         0.146       -0.133    -2.526    0.019    -0.669    -0.066 
##          X7     4.085         1.213        0.406     3.367    0.003     1.575     6.595 
##          X8     0.612         0.133        0.493     4.614    0.000     0.337     0.886 
##          X9    -0.448         0.108       -0.450    -4.135    0.000    -0.672    -0.224 
## ----------------------------------------------------------------------------------------
## 
## 
##                           Elimination Summary                           
## -----------------------------------------------------------------------
##         Variable                  Adj.                                     
## Step    Removed     R-Square    R-Square     C(p)       AIC       RMSE     
## -----------------------------------------------------------------------
##    1    X2            0.9472      0.9304    7.0612    40.9019    0.4139    
##    2    X5            0.9468      0.9329    5.2440    39.1611    0.4065    
## -----------------------------------------------------------------------
# Stepwise AIC Backward Regression #
ols_step_backward_aic(model_wf_rm1_log)
## Backward Elimination Method 
## ---------------------------
## 
## Candidate Terms: 
## 
## 1 . X2 
## 2 . X3 
## 3 . X4 
## 4 . X5 
## 5 . X6 
## 6 . X7 
## 7 . X8 
## 8 . X9 
## 
## 
## Variables Removed: 
## 
## - X2 
## - X5 
## 
## No more variables to be removed.
## 
## 
##                   Backward Elimination Summary                   
## ---------------------------------------------------------------
## Variable       AIC       RSS     Sum Sq     R-Sq      Adj. R-Sq 
## ---------------------------------------------------------------
## Full Model    42.815    3.758    67.635    0.94737      0.92731 
## X2            40.902    3.769    67.624    0.94721      0.93042 
## X5            39.161    3.801    67.591    0.94675      0.93286 
## ---------------------------------------------------------------

(d) (6) Best Subset Regression

# For the full model #
k <- ols_step_best_subset(model_wf_full_log)
k
mindex n predictors rsquare adjr predrsq cp aic sbic sbc msep fpe apc hsp
1 1 X4 0.803 0.796 0.772 48.9  68.4 -19.8 72.6 0.538 0.536 0.225  0.0186 
2 2 X3 X4 0.873 0.864 0.844 24.2  57.2 -30.5 62.8 0.373 0.369 0.155  0.0129 
3 3 X3 X4 X7 0.89  0.878 0.854 19.7  54.8 -32.7 61.8 0.348 0.341 0.143  0.012  
4 4 X1 X4 X8 X9 0.921 0.908 0.886 10.1  47   -38.1 55.4 0.272 0.264 0.111  0.00941
5 5 X3 X4 X7 X8 X9 0.932 0.918 0.892 7.85 44.5 -38.8 54.3 0.255 0.243 0.102  0.0088 
6 6 X3 X4 X6 X7 X8 X9 0.947 0.933 0.908 4.23 39.2 -39.7 50.4 0.217 0.204 0.0857 0.00751
7 7 X3 X4 X5 X6 X7 X8 X9 0.947 0.93  0.902 6.06 40.9 -36.8 53.5 0.236 0.217 0.0912 0.00816
8 8 X2 X3 X4 X5 X6 X7 X8 X9 0.947 0.927 0.896 8    42.8 -33.8 56.8 0.259 0.233 0.0977 0.00895
9 9 X1 X2 X3 X4 X5 X6 X7 X8 X9 0.947 0.924 0.886 10    44.8 -30.8 60.2 0.286 0.25  0.105  0.00989
plot(k)
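The best-subset table above reports one best model per size; the criteria can be compared programmatically. Reading the AIC and C(p) columns off the table (values transcribed from the output above), the six-predictor model X3 X4 X6 X7 X8 X9 minimizes AIC (39.2) and SBC (50.4), and its Cp of 4.23 is below p + 1 = 7:

```python
# AIC and Cp per model size, transcribed from the best-subset table above.
aic = [68.4, 57.2, 54.8, 47.0, 44.5, 39.2, 40.9, 42.8, 44.8]
cp  = [48.9, 24.2, 19.7, 10.1, 7.85, 4.23, 6.06, 8.0, 10.0]

# Model size (number of predictors) with the smallest AIC
best_size = min(range(1, 10), key=lambda p: aic[p - 1])
print(best_size)          # 6: the X3 X4 X6 X7 X8 X9 model
print(cp[best_size - 1])  # 4.23, below best_size + 1 = 7
```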

# For the X4-eliminated model #
k <- ols_step_best_subset(model_wf_rm4_log)
k
mindex n predictors rsquare adjr predrsq cp aic sbic sbc msep fpe apc hsp
1 1 X1 0.527 0.51  0.466 155    94.7 4.85 98.9 1.29  1.29  0.541  0.0447 
2 2 X3 X7 0.832 0.819 0.798 40.3  65.7 -23.2  71.3 0.495 0.49  0.206  0.0171 
3 3 X1 X3 X7 0.872 0.857 0.831 27    59.5 -29    66.5 0.408 0.399 0.168  0.0141 
4 4 X1 X3 X6 X7 0.893 0.876 0.845 20.8  56   -31.9  64.5 0.368 0.356 0.15   0.0127 
5 5 X1 X3 X7 X8 X9 0.909 0.89  0.857 16.8  53.2 -33.6  63.1 0.341 0.325 0.137  0.0118 
6 6 X1 X3 X6 X7 X8 X9 0.943 0.928 0.9   5.85 41.3 -38.9  52.5 0.233 0.219 0.092  0.00807
7 7 X1 X2 X3 X6 X7 X8 X9 0.944 0.927 0.897 7.24 42.5 -36.4  55.1 0.249 0.229 0.0961 0.00859
8 8 X1 X2 X3 X5 X6 X7 X8 X9 0.945 0.924 0.89  9    44.1 -33.7  58.1 0.27  0.243 0.102  0.00934
plot(k)

# For the X1-eliminated model #
k <- ols_step_best_subset(model_wf_rm1_log)
k
mindex n predictors rsquare adjr predrsq cp aic sbic sbc msep fpe apc hsp
1 1 X4 0.803 0.796 0.772 52.6  68.4 -20   72.6 0.538 0.536 0.225  0.0186 
2 2 X3 X4 0.873 0.864 0.844 26.6  57.2 -30.7 62.8 0.373 0.369 0.155  0.0129 
3 3 X3 X4 X7 0.89  0.878 0.854 21.7  54.8 -33   61.8 0.348 0.341 0.143  0.012  
4 4 X3 X4 X8 X9 0.916 0.902 0.879 13.7  49   -37.3 57.4 0.291 0.281 0.118  0.01   
5 5 X3 X4 X7 X8 X9 0.932 0.918 0.892 9.14 44.5 -39.4 54.3 0.255 0.243 0.102  0.0088 
6 6 X3 X4 X6 X7 X8 X9 0.947 0.933 0.908 5.24 39.2 -40.5 50.4 0.217 0.204 0.0857 0.00751
7 7 X3 X4 X5 X6 X7 X8 X9 0.947 0.93  0.902 7.06 40.9 -37.8 53.5 0.236 0.217 0.0912 0.00816
8 8 X2 X3 X4 X5 X6 X7 X8 X9 0.947 0.927 0.896 9    42.8 -35   56.8 0.259 0.233 0.0977 0.00895
plot(k)

(d) (7) Model Comparison

  • Additional Regression
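All-possible regression over the 9 candidate predictors fits every non-empty subset, 2^9 − 1 = 511 models in total, listed below in order of size: C(9,1) = 9 one-variable models (mindex 1-9), C(9,2) = 36 two-variable models (mindex 10-45), C(9,3) = 84 (mindex 46-129), and so on. A quick count check:

```python
from math import comb

# Number of models of each size when all subsets of 9 predictors are fit,
# matching the mindex blocks in the listing below (9, then 36, then 84, ...).
sizes = [comb(9, k) for k in range(1, 10)]
print(sizes)        # [9, 36, 84, 126, 126, 84, 36, 9, 1]
print(sum(sizes))   # 511
```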
# All Possible Regression for full model #
k <- ols_step_all_possible(model_wf_full_log)
k
mindex n predictors rsquare adjr predrsq cp aic sbic sbc msep fpe apc hsp
1 1 X4 0.803   0.796   0.772  48.9  68.4 -19.8    72.6 0.538 0.536 0.225  0.0186 
2 1 X1 0.527   0.51    0.466  154    94.7 4.86   98.9 1.29  1.29  0.541  0.0447 
3 1 X5 0.523   0.506   0.472  155    94.9 5.08   99.2 1.3   1.3   0.545  0.0451 
4 1 X2 0.433   0.413   0.346  189    100   10      104   1.55  1.54  0.648  0.0535 
5 1 X7 0.35    0.327   0.272  221    104   14      108   1.78  1.77  0.743  0.0614 
6 1 X3 0.227   0.199   0.133  268    109   19.1    114   2.11  2.1   0.884  0.073  
7 1 X8 0.0407  0.00648 -0.0793 339    116   25.4    120   2.62  2.61  1.1    0.0906 
8 1 X9 0.0176  -0.0175  -0.104  347    117   26.1    121   2.68  2.67  1.12   0.0928 
9 1 X6 0.00292 -0.0327  -0.125  353    117   26.5    121   2.72  2.71  1.14   0.0942 
10 2 X3 X4 0.873   0.864   0.844  24.2  57.2 -30.5    62.8 0.373 0.369 0.155  0.0129 
11 2 X1 X4 0.869   0.859   0.839  25.8  58.2 -29.6    63.8 0.386 0.381 0.16   0.0133 
12 2 X3 X7 0.832   0.819   0.798  40    65.7 -23.2    71.3 0.495 0.49  0.206  0.0171 
13 2 X4 X7 0.817   0.804   0.774  45.5  68.2 -21      73.8 0.538 0.532 0.223  0.0186 
14 2 X1 X3 0.812   0.798   0.775  47.4  69   -20.3    74.6 0.553 0.547 0.23   0.0191 
15 2 X4 X9 0.807   0.793   0.758  49.4  69.8 -19.6    75.4 0.568 0.562 0.236  0.0196 
16 2 X4 X8 0.806   0.792   0.759  49.7  69.9 -19.5    75.6 0.571 0.564 0.237  0.0197 
17 2 X4 X5 0.805   0.79    0.763  50.2  70.1 -19.3    75.7 0.574 0.568 0.239  0.0198 
18 2 X2 X4 0.804   0.79    0.764  50.4  70.2 -19.2    75.8 0.576 0.569 0.239  0.0199 
19 2 X4 X6 0.803   0.788   0.756  50.9  70.4 -19      76   0.58  0.573 0.241  0.02   
20 2 X2 X3 0.709   0.688   0.644  86.4  82.1 -8.5    87.7 0.855 0.845 0.355  0.0295 
21 2 X2 X5 0.646   0.62    0.592  110    88   -3.02   93.6 1.04  1.03  0.433  0.036  
22 2 X1 X5 0.604   0.575   0.539  126    91.3 0.114  96.9 1.16  1.15  0.484  0.0402 
23 2 X5 X7 0.6     0.57    0.536  128    91.6 0.41   97.3 1.18  1.16  0.489  0.0407 
24 2 X3 X5 0.564   0.531   0.489  142    94.3 2.88   99.9 1.28  1.27  0.533  0.0444 
25 2 X5 X8 0.553   0.52    0.479  146    95   3.57   101   1.32  1.3   0.547  0.0455 
26 2 X5 X6 0.544   0.511   0.451  149    95.6 4.09   101   1.34  1.32  0.557  0.0463 
27 2 X1 X2 0.543   0.509   0.477  150    95.7 4.18   101   1.34  1.33  0.558  0.0465 
28 2 X1 X8 0.533   0.498   0.44   154    96.3 4.83   102   1.38  1.36  0.571  0.0475 
29 2 X1 X9 0.528   0.493   0.42   155    96.6 5.11   102   1.39  1.37  0.577  0.048  
30 2 X1 X6 0.528   0.493   0.429  156    96.7 5.13   102   1.39  1.37  0.577  0.048  
31 2 X1 X7 0.527   0.492   0.437  156    96.7 5.18   102   1.39  1.38  0.578  0.0481 
32 2 X5 X9 0.526   0.491   0.439  156    96.7 5.22   102   1.39  1.38  0.579  0.0482 
33 2 X2 X8 0.451   0.41    0.335  185    101   9.42   107   1.62  1.6   0.671  0.0558 
34 2 X2 X7 0.44    0.398   0.318  189    102   10      107   1.65  1.63  0.685  0.057  
35 2 X2 X9 0.435   0.393   0.301  191    102   10.2    108   1.66  1.64  0.69   0.0574 
36 2 X2 X6 0.433   0.391   0.309  191    102   10.3    108   1.67  1.65  0.693  0.0576 
37 2 X7 X8 0.361   0.314   0.249  219    106   13.8    111   1.88  1.86  0.781  0.065  
38 2 X6 X7 0.35    0.302   0.224  223    106   14.3    112   1.91  1.89  0.794  0.0661 
39 2 X7 X9 0.35    0.302   0.224  223    106   14.3    112   1.91  1.89  0.794  0.0661 
40 2 X3 X8 0.282   0.229   0.133  249    109   17.1    115   2.11  2.09  0.877  0.073  
41 2 X3 X9 0.265   0.21    0.107  255    110   17.8    116   2.16  2.14  0.899  0.0748 
42 2 X3 X6 0.227   0.17    0.0769 270    111   19.3    117   2.27  2.25  0.945  0.0786 
43 2 X8 X9 0.051   -0.0193  -0.168  337    118   25.3    123   2.79  2.76  1.16   0.0965 
44 2 X6 X8 0.0423  -0.0286  -0.138  340    118   25.5    123   2.82  2.79  1.17   0.0974 
45 2 X6 X9 0.0207  -0.0519  -0.162  348    119   26.2    124   2.88  2.85  1.2    0.0996 
46 3 X3 X4 X7 0.89    0.878   0.854  19.7  54.8 -32.7    61.8 0.348 0.341 0.143  0.012  
47 3 X3 X4 X8 0.881   0.867   0.837  23.4  57.4 -30.7    64.4 0.38  0.372 0.156  0.0131 
48 3 X2 X3 X4 0.88    0.866   0.848  23.8  57.6 -30.5    64.7 0.383 0.375 0.157  0.0132 
49 3 X1 X3 X4 0.879   0.865   0.845  24.2  57.9 -30.3    64.9 0.386 0.378 0.159  0.0133 
50 3 X3 X4 X5 0.876   0.862   0.836  25.1  58.5 -29.8    65.5 0.394 0.386 0.162  0.0136 
51 3 X3 X4 X6 0.874   0.86    0.834  25.8  58.9 -29.5    65.9 0.399 0.391 0.164  0.0138 
52 3 X1 X2 X4 0.874   0.86    0.841  25.8  59   -29.4    66   0.4   0.392 0.165  0.0138 
53 3 X1 X4 X8 0.874   0.859   0.832  26    59.1 -29.3    66.1 0.402 0.393 0.165  0.0139 
54 3 X3 X4 X9 0.873   0.859   0.829  26.2  59.2 -29.3    66.2 0.403 0.394 0.166  0.0139 
55 3 X1 X4 X5 0.873   0.858   0.836  26.2  59.2 -29.2    66.2 0.403 0.395 0.166  0.0139 
56 3 X1 X3 X7 0.872   0.857   0.831  26.7  59.5 -29      66.5 0.408 0.399 0.168  0.0141 
57 3 X1 X4 X6 0.871   0.856   0.829  27.1  59.8 -28.8    66.8 0.411 0.402 0.169  0.0142 
58 3 X1 X4 X9 0.871   0.856   0.829  27.2  59.8 -28.7    66.8 0.411 0.403 0.169  0.0142 
59 3 X1 X4 X7 0.869   0.854   0.829  27.7  60.1 -28.5    67.1 0.415 0.407 0.171  0.0144 
60 3 X4 X8 X9 0.862   0.846   0.808  30.5  61.8 -27.2    68.8 0.439 0.43  0.181  0.0152 
61 3 X3 X6 X7 0.855   0.838   0.809  33.1  63.2 -26      70.2 0.461 0.452 0.19   0.0159 
62 3 X3 X7 X8 0.845   0.828   0.8    36.8  65.1 -24.4    72.1 0.491 0.481 0.202  0.017  
63 3 X2 X3 X7 0.835   0.816   0.783  40.7  67.1 -22.8    74.1 0.524 0.513 0.216  0.0181 
64 3 X3 X5 X7 0.835   0.815   0.789  40.9  67.2 -22.7    74.2 0.526 0.515 0.216  0.0182 
65 3 X3 X7 X9 0.833   0.813   0.788  41.6  67.5 -22.4    74.5 0.532 0.521 0.219  0.0184 
66 3 X1 X2 X3 0.831   0.812   0.795  42.2  67.8 -22.2    74.8 0.537 0.526 0.221  0.0186 
67 3 X1 X3 X5 0.826   0.806   0.773  44.1  68.7 -21.4    75.7 0.553 0.542 0.228  0.0191 
68 3 X1 X3 X6 0.826   0.805   0.769  44.3  68.8 -21.4    75.8 0.554 0.543 0.228  0.0192 
69 3 X1 X3 X8 0.824   0.804   0.767  44.9  69.1 -21.1    76.1 0.56  0.548 0.23   0.0193 
70 3 X4 X7 X8 0.821   0.801   0.76   46    69.5 -20.7    76.5 0.568 0.557 0.234  0.0196 
71 3 X4 X7 X9 0.82    0.799   0.759  46.6  69.8 -20.5    76.8 0.574 0.562 0.236  0.0198 
72 3 X2 X4 X7 0.819   0.799   0.765  46.6  69.8 -20.5    76.8 0.574 0.562 0.236  0.0198 
73 3 X4 X6 X7 0.817   0.796   0.759  47.4  70.1 -20.2    77.1 0.58  0.568 0.239  0.0201 
74 3 X4 X5 X7 0.817   0.796   0.759  47.4  70.1 -20.2    77.1 0.58  0.568 0.239  0.0201 
75 3 X1 X3 X9 0.813   0.791   0.75   49.2  70.9 -19.5    77.9 0.595 0.583 0.245  0.0206 
76 3 X4 X5 X8 0.809   0.787   0.752  50.7  71.5 -19      78.6 0.608 0.596 0.25   0.021  
77 3 X2 X4 X9 0.808   0.786   0.748  50.8  71.6 -19      78.6 0.609 0.596 0.251  0.021  
78 3 X4 X5 X9 0.808   0.786   0.749  50.9  71.6 -18.9    78.6 0.61  0.597 0.251  0.0211 
79 3 X2 X4 X8 0.807   0.785   0.75   51.3  71.8 -18.8    78.8 0.613 0.6   0.252  0.0212 
80 3 X4 X6 X9 0.807   0.785   0.741  51.4  71.8 -18.7    78.8 0.614 0.601 0.253  0.0212 
81 3 X4 X6 X8 0.806   0.784   0.742  51.7  71.9 -18.6    79   0.616 0.604 0.254  0.0213 
82 3 X2 X4 X5 0.806   0.783   0.756  51.9  72   -18.6    79   0.618 0.605 0.254  0.0214 
83 3 X4 X5 X6 0.805   0.783   0.747  52.1  72.1 -18.5    79.1 0.62  0.607 0.255  0.0214 
84 3 X2 X4 X6 0.804   0.782   0.749  52.4  72.2 -18.4    79.2 0.622 0.609 0.256  0.0215 
85 3 X2 X3 X5 0.761   0.734   0.7    68.7  78.2 -13.2    85.2 0.758 0.743 0.312  0.0262 
86 3 X2 X3 X8 0.737   0.707   0.649  77.9  81.1 -10.7    88.1 0.835 0.818 0.344  0.0289 
87 3 X2 X3 X9 0.721   0.689   0.62   83.9  82.8 -9.09   89.8 0.885 0.867 0.364  0.0306 
88 3 X2 X3 X6 0.714   0.681   0.623  86.6  83.6 -8.41   90.6 0.908 0.889 0.374  0.0314 
89 3 X2 X5 X8 0.665   0.627   0.594  105    88.3 -4.12   95.3 1.06  1.04  0.437  0.0367 
90 3 X2 X5 X6 0.66    0.621   0.575  107    88.7 -3.72   95.7 1.08  1.06  0.444  0.0373 
91 3 X1 X2 X5 0.647   0.606   0.579  112    89.9 -2.66   96.9 1.12  1.1   0.461  0.0387 
92 3 X2 X5 X9 0.647   0.606   0.557  112    89.9 -2.62   96.9 1.12  1.1   0.462  0.0388 
93 3 X2 X5 X7 0.646   0.605   0.551  112    90   -2.58   97   1.12  1.1   0.463  0.0389 
94 3 X5 X6 X7 0.62    0.577   0.516  122    92.1 -0.645  99.1 1.21  1.18  0.496  0.0417 
95 3 X1 X5 X8 0.618   0.573   0.525  123    92.3 -0.431  99.3 1.22  1.19  0.5    0.042  
96 3 X1 X5 X6 0.617   0.573   0.518  124    92.4 -0.378  99.4 1.22  1.19  0.501  0.0421 
97 3 X5 X7 X8 0.616   0.572   0.531  124    92.4 -0.348  99.4 1.22  1.19  0.502  0.0421 
98 3 X1 X5 X7 0.611   0.566   0.518  126    92.8 0.016  99.8 1.24  1.21  0.508  0.0427 
99 3 X1 X5 X9 0.604   0.559   0.499  128    93.3 0.508  100   1.26  1.23  0.517  0.0435 
100 3 X3 X5 X8 0.601   0.555   0.503  130    93.6 0.754  101   1.27  1.24  0.522  0.0438 
101 3 X5 X7 X9 0.6     0.554   0.502  130    93.6 0.804  101   1.27  1.24  0.523  0.0439 
102 3 X5 X8 X9 0.598   0.551   0.488  131    93.8 0.981  101   1.28  1.25  0.526  0.0442 
103 3 X3 X5 X6 0.588   0.54    0.466  135    94.6 1.65   102   1.31  1.28  0.539  0.0453 
104 3 X1 X8 X9 0.582   0.534   0.44   137    95   2.04   102   1.33  1.3   0.547  0.0459 
105 3 X5 X6 X8 0.578   0.529   0.465  138    95.3 2.32   102   1.34  1.31  0.552  0.0464 
106 3 X3 X5 X9 0.573   0.524   0.461  140    95.6 2.64   103   1.36  1.33  0.558  0.0469 
107 3 X1 X2 X7 0.552   0.5     0.444  148    97.1 4      104   1.43  1.4   0.586  0.0493 
108 3 X1 X2 X8 0.55    0.498   0.456  149    97.2 4.08   104   1.43  1.4   0.588  0.0494 
109 3 X5 X6 X9 0.547   0.495   0.415  150    97.4 4.28   104   1.44  1.41  0.592  0.0498 
110 3 X1 X2 X6 0.544   0.491   0.441  151    97.6 4.49   105   1.45  1.42  0.597  0.0501 
111 3 X1 X2 X9 0.544   0.491   0.43   151    97.6 4.49   105   1.45  1.42  0.597  0.0501 
112 3 X1 X6 X8 0.534   0.48    0.401  155    98.3 5.09   105   1.48  1.45  0.61   0.0512 
113 3 X1 X7 X8 0.533   0.479   0.409  156    98.3 5.14   105   1.48  1.45  0.611  0.0513 
114 3 X1 X6 X9 0.529   0.475   0.376  157    98.6 5.39   106   1.5   1.47  0.616  0.0517 
115 3 X1 X7 X9 0.528   0.473   0.389  157    98.6 5.44   106   1.5   1.47  0.617  0.0518 
116 3 X1 X6 X7 0.528   0.473   0.394  157    98.6 5.45   106   1.5   1.47  0.618  0.0519 
117 3 X2 X8 X9 0.476   0.415   0.295  177    102   8.4    109   1.67  1.63  0.686  0.0576 
118 3 X2 X7 X8 0.455   0.392   0.3    185    103   9.52   110   1.73  1.7   0.713  0.0599 
119 3 X2 X6 X8 0.451   0.388   0.295  187    103   9.7    110   1.74  1.71  0.718  0.0603 
120 3 X2 X7 X9 0.44    0.376   0.271  191    104   10.3    111   1.78  1.74  0.732  0.0615 
121 3 X2 X6 X7 0.44    0.375   0.27   191    104   10.3    111   1.78  1.74  0.733  0.0616 
122 3 X2 X6 X9 0.435   0.37    0.259  193    104   10.5    111   1.79  1.76  0.738  0.062  
123 3 X7 X8 X9 0.408   0.34    0.216  203    105   11.9    112   1.88  1.84  0.774  0.065  
124 3 X6 X7 X8 0.361   0.288   0.199  221    108   14      115   2.03  1.99  0.835  0.0701 
125 3 X6 X7 X9 0.35    0.276   0.168  225    108   14.5    115   2.06  2.02  0.849  0.0713 
126 3 X3 X6 X8 0.283   0.201   0.0775 250    111   17.3    118   2.28  2.23  0.937  0.0787 
127 3 X3 X8 X9 0.283   0.2     0.0776 250    111   17.3    118   2.28  2.23  0.937  0.0787 
128 3 X3 X6 X9 0.265   0.18    0.047  257    112   18      119   2.34  2.29  0.961  0.0807 
129 3 X6 X8 X9 0.0515  -0.0579  -0.242  338    120   25.4    127   3.01  2.95  1.24   0.104  
130 4 X1 X4 X8 X9 0.921   0.908   0.886  10.1  47   -38.1    55.4 0.272 0.264 0.111  0.00941
131 4 X3 X4 X8 X9 0.916   0.902   0.879  12.1  49   -36.8    57.4 0.291 0.281 0.118  0.01   
132 4 X3 X4 X7 X8 0.898   0.882   0.85   18.6  54.5 -32.9    62.9 0.35  0.338 0.142  0.0121 
133 4 X3 X4 X6 X7 0.897   0.881   0.854  19    54.8 -32.7    63.2 0.353 0.342 0.144  0.0122 
134 4 X3 X4 X5 X7 0.894   0.877   0.848  20.3  55.9 -32      64.3 0.365 0.354 0.149  0.0126 
135 4 X1 X3 X6 X7 0.893   0.876   0.845  20.6  56   -31.8    64.5 0.368 0.356 0.15   0.0127 
136 4 X1 X3 X4 X7 0.892   0.875   0.851  20.9  56.3 -31.7    64.7 0.37  0.359 0.151  0.0128 
137 4 X2 X3 X4 X7 0.891   0.873   0.847  21.6  56.8 -31.3    65.2 0.377 0.364 0.153  0.013  
138 4 X3 X4 X7 X9 0.89    0.873   0.842  21.7  56.8 -31.3    65.2 0.377 0.365 0.153  0.013  
139 4 X2 X3 X4 X8 0.888   0.87    0.843  22.5  57.4 -30.8    65.8 0.385 0.372 0.156  0.0133 
140 4 X1 X2 X3 X4 0.887   0.868   0.85   23.1  57.8 -30.5    66.2 0.39  0.378 0.159  0.0135 
141 4 X1 X3 X4 X8 0.885   0.867   0.839  23.7  58.2 -30.2    66.6 0.395 0.383 0.161  0.0137 
142 4 X1 X3 X5 X7 0.883   0.864   0.833  24.6  58.8 -29.8    67.3 0.404 0.391 0.164  0.014  
143 4 X1 X2 X4 X5 0.883   0.864   0.845  24.6  58.9 -29.8    67.3 0.404 0.391 0.164  0.014  
144 4 X3 X4 X5 X8 0.882   0.864   0.828  24.7  58.9 -29.7    67.3 0.405 0.392 0.165  0.014  
145 4 X1 X3 X7 X8 0.882   0.864   0.831  24.7  58.9 -29.7    67.3 0.405 0.392 0.165  0.014  
146 4 X3 X4 X6 X8 0.882   0.863   0.826  24.7  59   -29.7    67.4 0.405 0.392 0.165  0.014  
147 4 X2 X3 X4 X6 0.881   0.862   0.838  25.1  59.2 -29.5    67.6 0.408 0.395 0.166  0.0141 
148 4 X2 X3 X4 X5 0.881   0.862   0.838  25.1  59.2 -29.5    67.6 0.408 0.395 0.166  0.0141 
149 4 X2 X3 X4 X9 0.88    0.86    0.831  25.8  59.6 -29.2    68.1 0.415 0.401 0.169  0.0143 
150 4 X1 X2 X4 X8 0.879   0.86    0.836  25.8  59.7 -29.1    68.1 0.415 0.402 0.169  0.0143 
151 4 X1 X4 X5 X8 0.879   0.86    0.832  25.9  59.7 -29.1    68.1 0.415 0.402 0.169  0.0144 
152 4 X1 X3 X4 X9 0.879   0.86    0.833  26    59.8 -29.1    68.2 0.417 0.403 0.169  0.0144 
153 4 X1 X3 X4 X5 0.879   0.859   0.833  26.2  59.9 -29      68.3 0.418 0.405 0.17   0.0145 
154 4 X1 X3 X4 X6 0.879   0.859   0.829  26.2  59.9 -29      68.3 0.418 0.405 0.17   0.0145 
155 4 X3 X4 X5 X6 0.876   0.856   0.824  27    60.5 -28.6    68.9 0.426 0.412 0.173  0.0147 
156 4 X1 X2 X4 X6 0.876   0.856   0.831  27    60.5 -28.6    68.9 0.426 0.412 0.173  0.0147 
157 4 X3 X4 X5 X9 0.876   0.856   0.821  27.1  60.5 -28.5    68.9 0.426 0.413 0.173  0.0147 
158 4 X1 X2 X4 X9 0.875   0.855   0.828  27.4  60.7 -28.4    69.1 0.429 0.416 0.175  0.0148 
159 4 X1 X4 X6 X8 0.875   0.855   0.819  27.4  60.7 -28.4    69.1 0.43  0.416 0.175  0.0148 
160 4 X1 X4 X5 X7 0.875   0.855   0.827  27.5  60.7 -28.3    69.1 0.43  0.416 0.175  0.0149 
161 4 X1 X2 X4 X7 0.874   0.854   0.829  27.7  60.9 -28.2    69.3 0.432 0.418 0.176  0.0149 
162 4 X3 X4 X6 X9 0.874   0.854   0.819  27.7  60.9 -28.2    69.3 0.432 0.418 0.176  0.0149 
163 4 X1 X4 X5 X9 0.874   0.854   0.826  27.8  61   -28.2    69.4 0.433 0.419 0.176  0.015  
164 4 X1 X4 X7 X8 0.874   0.854   0.82   27.9  61   -28.1    69.4 0.434 0.42  0.177  0.015  
165 4 X1 X4 X5 X6 0.874   0.853   0.824  28    61.1 -28.1    69.5 0.435 0.421 0.177  0.015  
166 4 X1 X4 X6 X9 0.872   0.852   0.819  28.5  61.4 -27.8    69.8 0.44  0.425 0.179  0.0152 
167 4 X1 X3 X7 X9 0.872   0.852   0.818  28.6  61.5 -27.8    69.9 0.441 0.426 0.179  0.0152 
168 4 X1 X2 X3 X7 0.872   0.851   0.822  28.7  61.5 -27.7    69.9 0.441 0.427 0.18   0.0153 
169 4 X1 X4 X6 X7 0.871   0.851   0.819  29    61.7 -27.6    70.1 0.444 0.429 0.18   0.0153 
170 4 X1 X4 X7 X9 0.871   0.85    0.818  29    61.7 -27.6    70.1 0.444 0.429 0.18   0.0153 
171 4 X3 X6 X7 X8 0.871   0.85    0.814  29.1  61.8 -27.6    70.2 0.445 0.431 0.181  0.0154 
172 4 X4 X7 X8 X9 0.871   0.85    0.811  29.2  61.8 -27.5    70.2 0.445 0.431 0.181  0.0154 
173 4 X3 X7 X8 X9 0.869   0.848   0.817  29.7  62.1 -27.3    70.5 0.45  0.436 0.183  0.0156 
174 4 X4 X5 X8 X9 0.866   0.844   0.801  31    62.9 -26.7    71.3 0.462 0.447 0.188  0.016  
175 4 X2 X4 X8 X9 0.864   0.842   0.8    31.7  63.3 -26.4    71.7 0.469 0.454 0.191  0.0162 
176 4 X4 X6 X8 X9 0.864   0.842   0.796  31.8  63.4 -26.3    71.8 0.469 0.454 0.191  0.0162 
177 4 X3 X5 X6 X7 0.861   0.839   0.802  32.8  63.9 -25.9    72.3 0.478 0.463 0.194  0.0165 
178 4 X2 X3 X6 X7 0.856   0.833   0.8    34.6  64.9 -25.1    73.3 0.494 0.479 0.201  0.0171 
179 4 X3 X6 X7 X9 0.856   0.833   0.797  34.9  65.1 -25      73.5 0.497 0.481 0.202  0.0172 
180 4 X2 X3 X7 X8 0.85    0.826   0.784  37.1  66.3 -24      74.7 0.517 0.501 0.21   0.0179 
181 4 X3 X5 X7 X8 0.849   0.825   0.793  37.4  66.5 -23.9    74.9 0.52  0.504 0.212  0.018  
182 4 X1 X3 X8 X9 0.847   0.822   0.788  38.3  66.9 -23.5    75.3 0.528 0.511 0.215  0.0183 
183 4 X1 X2 X3 X8 0.845   0.82    0.792  39    67.3 -23.3    75.7 0.535 0.518 0.217  0.0185 
184 4 X1 X2 X3 X6 0.843   0.818   0.786  39.6  67.6 -23      76   0.54  0.522 0.22   0.0187 
185 4 X1 X3 X6 X8 0.839   0.813   0.763  41.2  68.3 -22.4    76.8 0.554 0.536 0.225  0.0192 
186 4 X2 X3 X5 X7 0.839   0.813   0.773  41.4  68.4 -22.3    76.9 0.556 0.538 0.226  0.0192 
187 4 X1 X2 X3 X5 0.837   0.811   0.784  42    68.8 -22.1    77.2 0.562 0.544 0.228  0.0194 
188 4 X2 X3 X7 X9 0.837   0.81    0.771  42.1  68.8 -22      77.2 0.563 0.544 0.229  0.0194 
189 4 X3 X5 X7 X9 0.835   0.809   0.779  42.5  69   -21.9    77.4 0.567 0.548 0.23   0.0196 
190 4 X1 X3 X5 X8 0.835   0.808   0.761  42.7  69.1 -21.8    77.5 0.568 0.55  0.231  0.0196 
191 4 X1 X3 X5 X6 0.834   0.808   0.764  43    69.2 -21.7    77.6 0.57  0.552 0.232  0.0197 
192 4 X1 X2 X3 X9 0.833   0.806   0.771  43.6  69.5 -21.4    77.9 0.577 0.558 0.234  0.0199 
193 4 X1 X3 X5 X9 0.826   0.799   0.744  45.9  70.6 -20.6    79   0.597 0.578 0.243  0.0206 
194 4 X1 X3 X6 X9 0.826   0.798   0.743  46.1  70.7 -20.5    79.1 0.599 0.579 0.243  0.0207 
195 4 X2 X4 X7 X8 0.824   0.796   0.748  46.9  71   -20.2    79.4 0.606 0.587 0.246  0.0209 
196 4 X4 X5 X7 X8 0.822   0.793   0.745  47.8  71.4 -19.9    79.8 0.614 0.594 0.25   0.0212 
197 4 X4 X6 X7 X8 0.821   0.793   0.742  47.9  71.5 -19.9    79.9 0.615 0.595 0.25   0.0213 
198 4 X2 X4 X7 X9 0.821   0.792   0.742  48    71.5 -19.8    79.9 0.616 0.596 0.25   0.0213 
199 4 X2 X4 X5 X7 0.82    0.791   0.747  48.5  71.7 -19.7    80.1 0.62  0.6   0.252  0.0214 
200 4 X4 X6 X7 X9 0.82    0.791   0.742  48.5  71.8 -19.6    80.2 0.621 0.601 0.252  0.0215 
201 4 X2 X4 X6 X7 0.82    0.791   0.747  48.5  71.8 -19.6    80.2 0.621 0.601 0.252  0.0215 
202 4 X4 X5 X7 X9 0.82    0.791   0.743  48.5  71.8 -19.6    80.2 0.621 0.601 0.253  0.0215 
203 4 X4 X5 X6 X7 0.818   0.788   0.741  49.4  72.1 -19.3    80.5 0.628 0.608 0.255  0.0217 
204 4 X2 X4 X5 X9 0.809   0.779   0.738  52.5  73.5 -18.2    81.9 0.657 0.636 0.267  0.0227 
205 4 X4 X5 X6 X8 0.809   0.779   0.734  52.5  73.5 -18.2    81.9 0.657 0.636 0.267  0.0227 
206 4 X2 X4 X5 X8 0.809   0.779   0.743  52.6  73.5 -18.2    81.9 0.657 0.636 0.267  0.0227 
207 4 X4 X5 X6 X9 0.808   0.778   0.73   52.8  73.6 -18.1    82   0.66  0.638 0.268  0.0228 
208 4 X2 X4 X6 X9 0.808   0.778   0.731  52.8  73.6 -18.1    82   0.66  0.638 0.268  0.0228 
209 4 X2 X4 X6 X8 0.807   0.776   0.733  53.3  73.8 -18      82.2 0.664 0.642 0.27   0.0229 
210 4 X2 X4 X5 X6 0.806   0.775   0.738  53.9  74   -17.8    82.4 0.669 0.648 0.272  0.0231 
211 4 X2 X3 X5 X8 0.788   0.754   0.711  60.7  76.7 -15.5    85.1 0.731 0.708 0.297  0.0253 
212 4 X2 X3 X5 X6 0.777   0.742   0.689  64.7  78.1 -14.3    86.5 0.767 0.742 0.312  0.0265 
213 4 X2 X3 X5 X9 0.768   0.731   0.675  68.2  79.3 -13.3    87.7 0.799 0.773 0.325  0.0276 
214 4 X2 X3 X8 X9 0.744   0.703   0.639  77.2  82.2 -10.8    90.7 0.881 0.852 0.358  0.0304 
215 4 X2 X3 X6 X8 0.744   0.703   0.628  77.3  82.3 -10.7    90.7 0.881 0.853 0.358  0.0305 
216 4 X2 X3 X6 X9 0.726   0.682   0.594  84    84.3 -8.98   92.7 0.943 0.912 0.383  0.0326 
217 4 X2 X5 X8 X9 0.712   0.666   0.613  89.3  85.8 -7.69   94.2 0.99  0.958 0.403  0.0342 
218 4 X2 X5 X6 X8 0.682   0.632   0.58   101    88.7 -5.04   97.1 1.09  1.06  0.445  0.0378 
219 4 X5 X7 X8 X9 0.679   0.628   0.561  102    89.1 -4.76   97.5 1.11  1.07  0.449  0.0382 
220 4 X1 X5 X8 X9 0.672   0.62    0.548  105    89.7 -4.2    98.1 1.13  1.09  0.459  0.039  
221 4 X2 X5 X7 X8 0.666   0.612   0.549  107    90.3 -3.67   98.7 1.15  1.11  0.468  0.0398 
222 4 X1 X2 X5 X8 0.665   0.612   0.575  107    90.3 -3.65   98.7 1.15  1.11  0.468  0.0398 
223 4 X1 X2 X5 X6 0.661   0.607   0.562  109    90.7 -3.29   99.1 1.17  1.13  0.475  0.0403 
224 4 X2 X5 X6 X9 0.661   0.607   0.536  109    90.7 -3.28   99.1 1.17  1.13  0.475  0.0404 
225 4 X2 X5 X6 X7 0.661   0.606   0.527  109    90.7 -3.27   99.1 1.17  1.13  0.475  0.0404 
226 4 X1 X2 X5 X9 0.647   0.591   0.541  114    91.9 -2.23   100   1.21  1.17  0.494  0.0419 
227 4 X1 X2 X5 X7 0.647   0.591   0.537  114    91.9 -2.23   100   1.21  1.17  0.494  0.042  
228 4 X2 X5 X7 X9 0.647   0.59    0.515  114    91.9 -2.17   100   1.22  1.18  0.495  0.042  
229 4 X5 X6 X8 X9 0.641   0.583   0.484  116    92.4 -1.73   101   1.24  1.2   0.503  0.0427 
230 4 X5 X6 X7 X8 0.639   0.582   0.515  117    92.6 -1.61   101   1.24  1.2   0.505  0.0429 
231 4 X1 X5 X6 X8 0.633   0.574   0.507  120    93.1 -1.12   102   1.26  1.22  0.514  0.0437 
232 4 X3 X5 X6 X8 0.629   0.57    0.49   121    93.4 -0.872  102   1.28  1.23  0.519  0.0441 
233 4 X3 X5 X8 X9 0.628   0.569   0.513  121    93.5 -0.801  102   1.28  1.24  0.52   0.0442 
234 4 X1 X5 X6 X7 0.627   0.567   0.497  122    93.6 -0.697  102   1.28  1.24  0.522  0.0444 
235 4 X1 X5 X7 X8 0.624   0.564   0.502  123    93.8 -0.52   102   1.29  1.25  0.526  0.0447 
236 4 X5 X6 X7 X9 0.621   0.56    0.474  124    94.1 -0.235  102   1.31  1.26  0.531  0.0452 
237 4 X1 X5 X6 X9 0.617   0.556   0.472  126    94.4 0.0274 103   1.32  1.28  0.536  0.0456 
238 4 X1 X5 X7 X9 0.611   0.549   0.477  128    94.8 0.41   103   1.34  1.29  0.544  0.0462 
239 4 X3 X5 X6 X9 0.596   0.531   0.439  133    95.9 1.47   104   1.39  1.35  0.565  0.0481 
240 4 X1 X2 X8 X9 0.595   0.531   0.451  134    96   1.52   104   1.39  1.35  0.567  0.0482 
241 4 X1 X6 X8 X9 0.588   0.522   0.399  137    96.5 2.02   105   1.42  1.37  0.577  0.049  
242 4 X1 X7 X8 X9 0.582   0.515   0.407  139    97   2.41   105   1.44  1.39  0.585  0.0497 
243 4 X1 X2 X7 X8 0.56    0.49    0.419  147    98.5 3.82   107   1.51  1.47  0.616  0.0523 
244 4 X1 X2 X6 X7 0.552   0.48    0.392  150    99.1 4.33   107   1.54  1.49  0.627  0.0533 
245 4 X1 X2 X7 X9 0.552   0.48    0.392  150    99.1 4.35   107   1.54  1.49  0.628  0.0534 
246 4 X1 X2 X6 X8 0.551   0.479   0.418  151    99.1 4.38   108   1.55  1.5   0.628  0.0534 
247 4 X1 X2 X6 X9 0.544   0.471   0.387  153    99.6 4.8    108   1.57  1.52  0.638  0.0542 
248 4 X1 X6 X7 X8 0.534   0.46    0.363  157    100   5.41   109   1.6   1.55  0.652  0.0554 
249 4 X1 X6 X7 X9 0.529   0.454   0.336  159    101   5.71   109   1.62  1.57  0.659  0.056  
250 4 X2 X7 X8 X9 0.485   0.403   0.269  176    103   8.18   112   1.77  1.71  0.72   0.0612 
251 4 X2 X6 X8 X9 0.476   0.393   0.251  179    104   8.66   112   1.8   1.74  0.733  0.0623 
252 4 X2 X6 X7 X8 0.455   0.367   0.25   187    105   9.8    113   1.88  1.82  0.763  0.0649 
253 4 X2 X6 X7 X9 0.44    0.351   0.216  193    106   10.5    114   1.93  1.86  0.784  0.0666 
254 4 X6 X7 X8 X9 0.412   0.318   0.158  203    107   11.9    116   2.02  1.96  0.823  0.07   
255 4 X3 X6 X8 X9 0.285   0.17    0.0102 252    113   17.5    121   2.46  2.38  1      0.0851 
256 5 X3 X4 X7 X8 X9 0.932   0.918   0.892  7.85 44.5 -38.8    54.3 0.255 0.243 0.102  0.0088 
257 5 X1 X4 X5 X8 X9 0.928   0.913   0.893  9.26 46.1 -37.8    55.9 0.268 0.256 0.108  0.00928
258 5 X1 X3 X4 X8 X9 0.925   0.909   0.886  10.7  47.6 -37      57.4 0.282 0.269 0.113  0.00976
259 5 X1 X2 X4 X8 X9 0.924   0.909   0.886  10.7  47.7 -36.9    57.5 0.283 0.27  0.113  0.00978
260 5 X1 X4 X7 X8 X9 0.923   0.906   0.879  11.4  48.4 -36.5    58.2 0.29  0.277 0.116  0.01   
261 5 X1 X4 X6 X8 X9 0.921   0.904   0.877  12.1  49   -36.1    58.8 0.296 0.282 0.119  0.0102 
262 5 X3 X4 X6 X8 X9 0.921   0.904   0.877  12.2  49.2 -36      59   0.297 0.284 0.119  0.0103 
263 5 X2 X3 X4 X8 X9 0.919   0.903   0.879  12.6  49.6 -35.8    59.4 0.301 0.287 0.121  0.0104 
264 5 X3 X4 X5 X8 X9 0.916   0.898   0.868  13.9  50.8 -35      60.7 0.314 0.3   0.126  0.0109 
265 5 X1 X3 X7 X8 X9 0.909   0.89    0.857  16.6  53.2 -33.5    63.1 0.341 0.325 0.137  0.0118 
266 5 X3 X4 X6 X7 X8 0.907   0.888   0.851  17.3  53.8 -33.1    63.6 0.347 0.331 0.139  0.012  
267 5 X1 X3 X6 X7 X8 0.906   0.886   0.848  17.8  54.3 -32.8    64.1 0.353 0.337 0.141  0.0122 
268 5 X3 X6 X7 X8 X9 0.905   0.885   0.856  18.1  54.5 -32.7    64.3 0.355 0.339 0.142  0.0123 
269 5 X3 X4 X5 X7 X8 0.901   0.88    0.842  19.6  55.8 -31.9    65.6 0.371 0.354 0.149  0.0128 
270 5 X1 X3 X4 X7 X8 0.9     0.879   0.846  20    56.1 -31.7    65.9 0.374 0.357 0.15   0.0129 
271 5 X2 X3 X4 X7 X8 0.899   0.878   0.842  20.4  56.4 -31.5    66.2 0.378 0.36  0.151  0.0131 
272 5 X3 X4 X5 X6 X7 0.899   0.877   0.843  20.6  56.5 -31.4    66.3 0.38  0.362 0.152  0.0131 
273 5 X1 X3 X5 X6 X7 0.898   0.877   0.841  20.6  56.6 -31.3    66.4 0.381 0.363 0.153  0.0132 
274 5 X2 X3 X4 X6 X7 0.898   0.876   0.848  20.9  56.8 -31.2    66.6 0.383 0.366 0.154  0.0133 
275 5 X1 X3 X4 X6 X7 0.897   0.876   0.843  21    56.8 -31.2    66.6 0.384 0.366 0.154  0.0133 
276 5 X3 X4 X6 X7 X9 0.897   0.876   0.842  21    56.8 -31.2    66.6 0.384 0.366 0.154  0.0133 
277 5 X1 X2 X3 X4 X8 0.894   0.873   0.846  22.1  57.7 -30.6    67.5 0.395 0.377 0.158  0.0136 
278 5 X3 X4 X5 X7 X9 0.894   0.872   0.835  22.3  57.8 -30.5    67.6 0.397 0.379 0.159  0.0137 
279 5 X1 X3 X4 X5 X7 0.894   0.872   0.842  22.3  57.8 -30.5    67.7 0.397 0.379 0.159  0.0137 
280 5 X2 X3 X4 X5 X7 0.894   0.872   0.841  22.3  57.8 -30.5    67.7 0.397 0.379 0.159  0.0137 
281 5 X1 X2 X3 X6 X7 0.893   0.871   0.84   22.5  58   -30.4    67.8 0.399 0.38  0.16   0.0138 
282 5 X1 X3 X6 X7 X9 0.893   0.871   0.832  22.5  58   -30.4    67.8 0.399 0.381 0.16   0.0138 
283 5 X1 X2 X3 X4 X7 0.893   0.871   0.841  22.6  58   -30.4    67.8 0.399 0.381 0.16   0.0138 
284 5 X1 X3 X4 X7 X9 0.893   0.87    0.84   22.8  58.2 -30.2    68   0.402 0.384 0.161  0.0139 
285 5 X1 X3 X5 X7 X8 0.891   0.868   0.828  23.4  58.7 -29.9    68.5 0.408 0.389 0.164  0.0141 
286 5 X2 X3 X4 X6 X8 0.891   0.868   0.832  23.5  58.7 -29.9    68.5 0.408 0.389 0.164  0.0141 
287 5 X1 X2 X4 X5 X8 0.891   0.868   0.844  23.5  58.7 -29.9    68.5 0.408 0.389 0.164  0.0141 
288 5 X2 X3 X4 X7 X9 0.891   0.868   0.833  23.6  58.8 -29.9    68.6 0.409 0.39  0.164  0.0141 
289 5 X2 X3 X4 X5 X8 0.889   0.866   0.832  24.1  59.1 -29.6    68.9 0.414 0.395 0.166  0.0143 
290 5 X1 X2 X3 X4 X5 0.887   0.863   0.839  24.9  59.7 -29.2    69.5 0.423 0.403 0.169  0.0146 
291 5 X1 X2 X3 X4 X9 0.887   0.863   0.835  25.1  59.8 -29.1    69.6 0.424 0.404 0.17   0.0147 
292 5 X1 X2 X3 X4 X6 0.887   0.863   0.834  25.1  59.8 -29.1    69.6 0.424 0.405 0.17   0.0147 
293 5 X1 X3 X4 X6 X8 0.885   0.861   0.821  25.6  60.2 -28.9    70   0.429 0.41  0.172  0.0148 
294 5 X1 X3 X4 X5 X8 0.885   0.861   0.826  25.7  60.2 -28.9    70   0.43  0.41  0.172  0.0149 
295 5 X1 X2 X3 X5 X7 0.884   0.86    0.827  26    60.5 -28.7    70.3 0.433 0.413 0.174  0.015  
296 5 X3 X4 X5 X6 X8 0.883   0.859   0.815  26.4  60.7 -28.5    70.5 0.437 0.417 0.175  0.0151 
297 5 X1 X2 X4 X5 X9 0.883   0.859   0.832  26.4  60.7 -28.5    70.6 0.437 0.417 0.175  0.0151 
298 5 X1 X3 X5 X7 X9 0.883   0.858   0.816  26.5  60.8 -28.5    70.6 0.438 0.418 0.176  0.0151 
299 5 X1 X2 X4 X5 X6 0.883   0.858   0.831  26.5  60.8 -28.4    70.6 0.438 0.418 0.176  0.0151 
300 5 X1 X2 X4 X5 X7 0.883   0.858   0.831  26.6  60.9 -28.4    70.7 0.439 0.419 0.176  0.0152 
301 5 X1 X2 X3 X7 X8 0.883   0.858   0.821  26.6  60.9 -28.4    70.7 0.44  0.419 0.176  0.0152 
302 5 X2 X3 X4 X5 X6 0.882   0.858   0.827  26.7  61   -28.3    70.8 0.441 0.42  0.177  0.0152 
303 5 X2 X3 X4 X6 X9 0.881   0.857   0.819  27.1  61.2 -28.2    71   0.444 0.423 0.178  0.0153 
304 5 X2 X3 X4 X5 X9 0.881   0.857   0.82   27.1  61.2 -28.2    71   0.444 0.423 0.178  0.0153 
305 5 X1 X4 X5 X7 X8 0.881   0.857   0.821  27.1  61.2 -28.2    71   0.444 0.423 0.178  0.0153 
306 5 X1 X2 X4 X6 X8 0.881   0.857   0.823  27.1  61.2 -28.2    71   0.444 0.424 0.178  0.0154 
307 5 X1 X2 X4 X7 X8 0.88    0.855   0.819  27.6  61.5 -28      71.3 0.449 0.428 0.18   0.0155 
308 5 X1 X4 X5 X6 X8 0.88    0.854   0.816  27.8  61.7 -27.9    71.5 0.451 0.43  0.181  0.0156 
309 5 X1 X3 X4 X5 X9 0.879   0.854   0.819  28    61.8 -27.8    71.6 0.453 0.432 0.182  0.0157 
310 5 X1 X3 X4 X6 X9 0.879   0.854   0.815  28    61.8 -27.8    71.6 0.453 0.432 0.182  0.0157 
311 5 X1 X3 X4 X5 X6 0.879   0.853   0.817  28.2  61.9 -27.7    71.7 0.454 0.433 0.182  0.0157 
312 5 X3 X5 X6 X7 X8 0.878   0.853   0.811  28.3  62   -27.6    71.8 0.456 0.435 0.183  0.0158 
313 5 X1 X2 X4 X6 X9 0.877   0.852   0.818  28.6  62.2 -27.5    72   0.459 0.438 0.184  0.0159 
314 5 X1 X2 X4 X6 X7 0.877   0.851   0.82   28.8  62.3 -27.4    72.1 0.461 0.44  0.185  0.0159 
315 5 X3 X5 X7 X8 X9 0.877   0.851   0.82   28.9  62.4 -27.3    72.2 0.462 0.44  0.185  0.016  
316 5 X3 X4 X5 X6 X9 0.876   0.851   0.807  29    62.4 -27.3    72.2 0.462 0.441 0.185  0.016  
317 5 X1 X4 X5 X7 X9 0.876   0.851   0.817  29    62.4 -27.3    72.2 0.462 0.441 0.185  0.016  
318 5 X1 X2 X4 X7 X9 0.875   0.85    0.812  29.3  62.6 -27.1    72.5 0.466 0.444 0.187  0.0161 
319 5 X1 X4 X6 X7 X8 0.875   0.85    0.806  29.3  62.6 -27.1    72.5 0.466 0.444 0.187  0.0161 
320 5 X1 X4 X5 X6 X7 0.875   0.849   0.813  29.4  62.7 -27.1    72.5 0.467 0.445 0.187  0.0161 
321 5 X1 X4 X5 X6 X9 0.875   0.849   0.813  29.6  62.8 -27      72.6 0.469 0.447 0.188  0.0162 
322 5 X1 X4 X6 X7 X9 0.873   0.846   0.808  30.3  63.3 -26.7    73.1 0.476 0.454 0.191  0.0164 
323 5 X2 X3 X6 X7 X8 0.873   0.846   0.804  30.3  63.3 -26.7    73.1 0.476 0.454 0.191  0.0165 
324 5 X4 X5 X7 X8 X9 0.872   0.846   0.799  30.5  63.4 -26.6    73.2 0.477 0.455 0.191  0.0165 
325 5 X1 X2 X3 X7 X9 0.872   0.845   0.807  30.6  63.5 -26.6    73.3 0.479 0.457 0.192  0.0166 
326 5 X4 X6 X7 X8 X9 0.872   0.845   0.797  30.8  63.6 -26.5    73.4 0.48  0.458 0.193  0.0166 
327 5 X2 X3 X7 X8 X9 0.872   0.845   0.798  30.8  63.6 -26.5    73.4 0.481 0.459 0.193  0.0166 
328 5 X4 X5 X6 X8 X9 0.871   0.844   0.797  30.9  63.6 -26.4    73.5 0.482 0.46  0.193  0.0166 
329 5 X2 X4 X7 X8 X9 0.871   0.844   0.795  31    63.7 -26.4    73.5 0.483 0.461 0.194  0.0167 
330 5 X1 X3 X6 X8 X9 0.869   0.842   0.798  31.7  64.1 -26.1    73.9 0.49  0.467 0.196  0.0169 
331 5 X2 X4 X5 X8 X9 0.867   0.839   0.792  32.7  64.7 -25.7    74.5 0.499 0.476 0.2    0.0172 
332 5 X2 X4 X6 X8 X9 0.866   0.838   0.788  33.1  64.9 -25.5    74.7 0.503 0.48  0.202  0.0174 
333 5 X1 X2 X3 X8 X9 0.864   0.836   0.809  33.7  65.3 -25.2    75.1 0.509 0.486 0.204  0.0176 
334 5 X2 X3 X5 X6 X7 0.863   0.834   0.794  34.1  65.5 -25.1    75.3 0.513 0.489 0.206  0.0177 
335 5 X3 X5 X6 X7 X9 0.862   0.833   0.79   34.6  65.8 -24.9    75.6 0.518 0.494 0.207  0.0179 
336 5 X1 X2 X3 X6 X8 0.858   0.829   0.784  35.8  66.5 -24.3    76.3 0.53  0.505 0.212  0.0183 
337 5 X2 X3 X6 X7 X9 0.857   0.828   0.787  36.2  66.7 -24.2    76.5 0.534 0.509 0.214  0.0184 
338 5 X2 X3 X5 X7 X8 0.854   0.823   0.776  37.5  67.4 -23.6    77.3 0.547 0.522 0.219  0.0189 
339 5 X1 X3 X5 X8 X9 0.853   0.822   0.773  38    67.7 -23.5    77.5 0.551 0.526 0.221  0.019  
340 5 X1 X2 X3 X5 X8 0.848   0.816   0.778  39.7  68.6 -22.8    78.4 0.568 0.542 0.228  0.0196 
341 5 X1 X2 X3 X5 X6 0.846   0.814   0.774  40.6  69   -22.4    78.8 0.577 0.55  0.231  0.0199 
342 5 X1 X3 X5 X6 X8 0.845   0.813   0.753  40.9  69.2 -22.3    79   0.58  0.553 0.232  0.02   
343 5 X1 X2 X3 X6 X9 0.844   0.812   0.76   41.1  69.3 -22.2    79.1 0.582 0.555 0.233  0.0201 
344 5 X2 X3 X5 X7 X9 0.84    0.807   0.76   42.8  70.2 -21.5    80   0.599 0.572 0.24   0.0207 
345 5 X1 X2 X3 X5 X9 0.838   0.804   0.756  43.6  70.5 -21.3    80.4 0.606 0.578 0.243  0.021  
346 5 X1 X3 X5 X6 X9 0.835   0.8     0.734  44.8  71.1 -20.8    80.9 0.618 0.59  0.248  0.0214 
347 5 X2 X4 X5 X7 X8 0.825   0.788   0.73   48.5  72.9 -19.5    82.7 0.655 0.625 0.263  0.0226 
348 5 X2 X4 X6 X7 X8 0.824   0.787   0.725  48.8  73   -19.4    82.8 0.658 0.628 0.264  0.0227 
349 5 X4 X5 X6 X7 X8 0.822   0.785   0.724  49.8  73.4 -19      83.2 0.667 0.636 0.267  0.0231 
350 5 X2 X4 X5 X7 X9 0.821   0.784   0.723  49.9  73.5 -19      83.3 0.668 0.638 0.268  0.0231 
351 5 X2 X4 X6 X7 X9 0.821   0.784   0.722  49.9  73.5 -19      83.3 0.669 0.638 0.268  0.0231 
352 5 X2 X4 X5 X6 X7 0.82    0.782   0.728  50.4  73.7 -18.8    83.5 0.674 0.643 0.27   0.0233 
353 5 X4 X5 X6 X7 X9 0.82    0.782   0.724  50.5  73.8 -18.8    83.6 0.675 0.644 0.27   0.0233 
354 5 X2 X4 X5 X6 X8 0.809   0.77    0.723  54.4  75.4 -17.4    85.2 0.713 0.68  0.286  0.0246 
355 5 X2 X4 X5 X6 X9 0.809   0.769   0.718  54.5  75.4 -17.4    85.3 0.714 0.681 0.286  0.0247 
356 5 X2 X3 X5 X8 X9 0.807   0.766   0.726  55.5  75.8 -17.1    85.7 0.723 0.69  0.29   0.025  
357 5 X2 X3 X5 X6 X8 0.807   0.766   0.706  55.5  75.9 -17.1    85.7 0.724 0.691 0.29   0.025  
358 5 X2 X3 X5 X6 X9 0.783   0.738   0.664  64.4  79.3 -14.3    89.1 0.811 0.774 0.325  0.028  
359 5 X2 X3 X6 X8 X9 0.753   0.702   0.622  75.7  83.1 -11.1    92.9 0.923 0.88  0.37   0.0319 
360 5 X2 X5 X6 X8 X9 0.745   0.692   0.612  78.9  84.1 -10.2    94   0.954 0.91  0.382  0.033  
361 5 X5 X6 X7 X8 X9 0.723   0.665   0.559  87.3  86.7 -8.09   96.5 1.04  0.989 0.416  0.0358 
362 5 X2 X5 X7 X8 X9 0.713   0.654   0.57   90.9  87.7 -7.23   97.5 1.07  1.02  0.43   0.0371 
363 5 X1 X2 X5 X8 X9 0.713   0.654   0.598  91    87.7 -7.21   97.5 1.07  1.02  0.43   0.0371 
364 5 X1 X5 X6 X8 X9 0.703   0.641   0.539  95    88.8 -6.27   98.6 1.11  1.06  0.446  0.0384 
365 5 X1 X5 X7 X8 X9 0.685   0.62    0.526  102    90.5 -4.78   100   1.18  1.12  0.472  0.0407 
366 5 X1 X2 X5 X6 X8 0.683   0.616   0.563  103    90.7 -4.56   101   1.19  1.13  0.476  0.0411 
367 5 X2 X5 X6 X7 X8 0.682   0.616   0.527  103    90.7 -4.55   101   1.19  1.13  0.476  0.0411 
368 5 X3 X5 X6 X8 X9 0.672   0.603   0.512  107    91.7 -3.67   102   1.23  1.17  0.493  0.0425 
369 5 X1 X2 X5 X7 X8 0.666   0.596   0.529  109    92.3 -3.21   102   1.25  1.19  0.501  0.0432 
370 5 X1 X2 X5 X6 X9 0.661   0.591   0.519  111    92.7 -2.84   102   1.27  1.21  0.508  0.0438 
371 5 X2 X5 X6 X7 X9 0.661   0.59    0.483  111    92.7 -2.82   103   1.27  1.21  0.509  0.0438 
372 5 X1 X2 X5 X6 X7 0.661   0.59    0.511  111    92.7 -2.82   103   1.27  1.21  0.509  0.0438 
373 5 X1 X2 X5 X7 X9 0.648   0.574   0.496  116    93.8 -1.8    104   1.32  1.26  0.528  0.0456 
374 5 X1 X5 X6 X7 X8 0.643   0.569   0.486  118    94.2 -1.45   104   1.34  1.27  0.535  0.0462 
375 5 X1 X5 X6 X7 X9 0.627   0.55    0.448  124    95.5 -0.292  105   1.4   1.33  0.559  0.0482 
376 5 X1 X2 X6 X8 X9 0.6     0.517   0.411  134    97.6 1.58   107   1.5   1.43  0.599  0.0517 
377 5 X1 X2 X7 X8 X9 0.599   0.516   0.411  134    97.7 1.64   108   1.5   1.43  0.601  0.0518 
378 5 X1 X6 X7 X8 X9 0.588   0.502   0.359  139    98.5 2.39   108   1.54  1.47  0.618  0.0533 
379 5 X1 X2 X6 X7 X8 0.561   0.469   0.362  149    100   4.14   110   1.64  1.57  0.659  0.0568 
380 5 X1 X2 X6 X7 X9 0.552   0.459   0.329  152    101   4.68   111   1.68  1.6   0.672  0.0579 
381 5 X2 X6 X7 X8 X9 0.487   0.38    0.214  177    105   8.4    115   1.92  1.83  0.769  0.0663 
382 6 X3 X4 X6 X7 X8 X9 0.947   0.933   0.908  4.23 39.2 -39.7    50.4 0.217 0.204 0.0857 0.00751
383 6 X1 X3 X6 X7 X8 X9 0.943   0.928   0.9    5.73 41.3 -38.8    52.5 0.233 0.219 0.092  0.00807
384 6 X1 X2 X4 X5 X8 X9 0.937   0.92    0.901  8.01 44.3 -37.5    55.5 0.258 0.242 0.102  0.00891
385 6 X1 X3 X4 X7 X8 X9 0.936   0.92    0.893  8.17 44.5 -37.4    55.7 0.26  0.243 0.102  0.00897
386 6 X1 X4 X5 X7 X8 X9 0.934   0.916   0.892  9.17 45.7 -36.8    56.9 0.27  0.254 0.107  0.00935
387 6 X3 X4 X5 X7 X8 X9 0.933   0.915   0.883  9.58 46.2 -36.6    57.4 0.275 0.258 0.108  0.0095 
388 6 X2 X3 X4 X7 X8 X9 0.932   0.914   0.883  9.84 46.5 -36.4    57.7 0.278 0.26  0.109  0.00959
389 6 X1 X2 X3 X4 X8 X9 0.93    0.911   0.888  10.7  47.5 -36      58.7 0.287 0.269 0.113  0.0099 
390 6 X1 X4 X5 X6 X8 X9 0.93    0.911   0.888  10.8  47.6 -35.9    58.8 0.288 0.27  0.113  0.00994
391 6 X1 X3 X4 X5 X8 X9 0.928   0.91    0.887  11.3  48.1 -35.7    59.3 0.293 0.275 0.115  0.0101 
392 6 X1 X3 X4 X6 X8 X9 0.925   0.906   0.876  12.3  49.3 -35.1    60.5 0.304 0.285 0.12   0.0105 
393 6 X2 X3 X4 X6 X8 X9 0.925   0.906   0.877  12.4  49.3 -35      60.5 0.305 0.286 0.12   0.0105 
394 6 X1 X2 X4 X7 X8 X9 0.925   0.905   0.872  12.7  49.6 -34.9    60.8 0.308 0.289 0.121  0.0107 
395 6 X1 X2 X4 X6 X8 X9 0.924   0.905   0.876  12.7  49.7 -34.8    60.9 0.309 0.289 0.122  0.0107 
396 6 X1 X4 X6 X7 X8 X9 0.923   0.902   0.868  13.4  50.4 -34.5    61.6 0.316 0.297 0.125  0.0109 
397 6 X3 X5 X6 X7 X8 X9 0.921   0.9     0.873  14    51   -34.1    62.2 0.323 0.303 0.127  0.0112 
398 6 X3 X4 X5 X6 X8 X9 0.921   0.9     0.869  14.2  51.2 -34.1    62.4 0.324 0.304 0.128  0.0112 
399 6 X2 X3 X4 X5 X8 X9 0.92    0.899   0.866  14.5  51.5 -33.9    62.7 0.328 0.308 0.129  0.0113 
400 6 X1 X3 X5 X7 X8 X9 0.913   0.89    0.847  17.1  53.9 -32.6    65.1 0.355 0.333 0.14   0.0123 
401 6 X1 X2 X3 X7 X8 X9 0.909   0.885   0.847  18.5  55.2 -31.8    66.4 0.371 0.348 0.146  0.0128 
402 6 X1 X3 X5 X6 X7 X8 0.909   0.885   0.839  18.6  55.3 -31.8    66.5 0.372 0.349 0.147  0.0129 
403 6 X1 X3 X4 X6 X7 X8 0.908   0.884   0.841  19.1  55.7 -31.5    66.9 0.377 0.354 0.149  0.013  
404 6 X3 X4 X5 X6 X7 X8 0.907   0.883   0.838  19.2  55.7 -31.5    66.9 0.378 0.354 0.149  0.0131 
405 6 X2 X3 X4 X6 X7 X8 0.907   0.883   0.843  19.2  55.8 -31.5    67   0.378 0.354 0.149  0.0131 
406 6 X1 X2 X3 X6 X7 X8 0.906   0.881   0.841  19.8  56.3 -31.2    67.5 0.385 0.361 0.152  0.0133 
407 6 X2 X3 X6 X7 X8 X9 0.905   0.881   0.846  20    56.4 -31.1    67.6 0.386 0.362 0.152  0.0133 
408 6 X1 X2 X3 X4 X7 X8 0.901   0.875   0.833  21.5  57.7 -30.3    68.9 0.403 0.378 0.159  0.0139 
409 6 X2 X3 X4 X5 X7 X8 0.901   0.875   0.834  21.6  57.7 -30.3    68.9 0.404 0.378 0.159  0.0139 
410 6 X1 X3 X4 X5 X7 X8 0.901   0.875   0.836  21.6  57.8 -30.3    69   0.404 0.379 0.159  0.014  
411 6 X1 X2 X3 X5 X6 X7 0.9     0.874   0.839  21.9  58   -30.2    69.2 0.407 0.382 0.16   0.0141 
412 6 X1 X3 X4 X5 X6 X7 0.9     0.874   0.836  22.1  58.2 -30      69.4 0.409 0.384 0.161  0.0142 
413 6 X3 X4 X5 X6 X7 X9 0.899   0.872   0.829  22.6  58.5 -29.8    69.7 0.414 0.388 0.163  0.0143 
414 6 X2 X3 X4 X5 X6 X7 0.899   0.872   0.837  22.6  58.5 -29.8    69.7 0.414 0.388 0.163  0.0143 
415 6 X1 X3 X5 X6 X7 X9 0.898   0.872   0.824  22.6  58.5 -29.8    69.8 0.415 0.389 0.163  0.0143 
416 6 X2 X3 X4 X6 X7 X9 0.898   0.871   0.833  22.9  58.8 -29.7    70   0.418 0.392 0.165  0.0145 
417 6 X1 X2 X3 X4 X6 X7 0.898   0.871   0.831  22.9  58.8 -29.7    70   0.418 0.392 0.165  0.0145 
418 6 X1 X3 X4 X6 X7 X9 0.897   0.871   0.83   23    58.8 -29.6    70   0.418 0.392 0.165  0.0145 
419 6 X1 X2 X3 X4 X5 X8 0.895   0.868   0.836  23.7  59.4 -29.3    70.6 0.427 0.4   0.168  0.0148 
420 6 X1 X2 X3 X4 X6 X8 0.895   0.867   0.828  24    59.6 -29.2    70.8 0.43  0.403 0.169  0.0148 
421 6 X1 X2 X3 X4 X5 X7 0.894   0.866   0.83   24.3  59.8 -29      71   0.433 0.406 0.171  0.015  
422 6 X1 X3 X4 X5 X7 X9 0.894   0.866   0.828  24.3  59.8 -29      71   0.433 0.406 0.171  0.015  
423 6 X2 X3 X4 X5 X7 X9 0.894   0.866   0.826  24.3  59.8 -29      71   0.433 0.406 0.171  0.015  
424 6 X1 X2 X3 X6 X7 X9 0.894   0.866   0.825  24.5  59.9 -28.9    71.2 0.435 0.408 0.171  0.015  
425 6 X1 X2 X3 X4 X7 X9 0.893   0.866   0.826  24.5  60   -28.9    71.2 0.435 0.408 0.172  0.015  
426 6 X1 X2 X3 X5 X7 X8 0.892   0.863   0.82   25.2  60.5 -28.6    71.7 0.443 0.415 0.174  0.0153 
427 6 X2 X3 X4 X5 X6 X8 0.891   0.863   0.82   25.4  60.6 -28.5    71.8 0.445 0.417 0.175  0.0154 
428 6 X1 X2 X4 X5 X7 X8 0.891   0.862   0.826  25.5  60.7 -28.5    71.9 0.445 0.418 0.175  0.0154 
429 6 X1 X2 X4 X5 X6 X8 0.891   0.862   0.828  25.5  60.7 -28.5    71.9 0.445 0.418 0.176  0.0154 
430 6 X1 X2 X3 X4 X5 X9 0.887   0.858   0.824  26.9  61.7 -27.8    72.9 0.461 0.432 0.182  0.0159 
431 6 X1 X2 X3 X4 X5 X6 0.887   0.858   0.823  26.9  61.7 -27.8    72.9 0.461 0.432 0.182  0.0159 
432 6 X1 X2 X3 X4 X6 X9 0.887   0.857   0.817  27    61.8 -27.8    73   0.462 0.434 0.182  0.016  
433 6 X1 X3 X4 X5 X6 X8 0.885   0.855   0.808  27.6  62.2 -27.5    73.4 0.468 0.439 0.185  0.0162 
434 6 X1 X2 X3 X6 X8 X9 0.884   0.854   0.814  27.9  62.4 -27.4    73.6 0.472 0.442 0.186  0.0163 
435 6 X1 X2 X3 X5 X7 X9 0.884   0.854   0.811  28    62.5 -27.3    73.7 0.473 0.443 0.186  0.0163 
436 6 X1 X2 X4 X5 X6 X9 0.883   0.853   0.818  28.3  62.7 -27.2    73.9 0.476 0.447 0.188  0.0165 
437 6 X1 X2 X4 X5 X7 X9 0.883   0.853   0.815  28.4  62.7 -27.1    74   0.477 0.447 0.188  0.0165 
438 6 X1 X2 X4 X5 X6 X7 0.883   0.852   0.817  28.5  62.8 -27.1    74   0.478 0.448 0.188  0.0165 
439 6 X2 X3 X4 X5 X6 X9 0.882   0.852   0.806  28.7  63   -27      74.2 0.481 0.451 0.189  0.0166 
440 6 X1 X2 X4 X6 X7 X8 0.882   0.851   0.805  28.8  63   -27      74.2 0.481 0.452 0.19   0.0166 
441 6 X1 X4 X5 X6 X7 X8 0.881   0.85    0.803  29.1  63.2 -26.9    74.4 0.484 0.454 0.191  0.0167 
442 6 X2 X3 X5 X6 X7 X8 0.881   0.85    0.803  29.3  63.4 -26.7    74.6 0.487 0.457 0.192  0.0168 
443 6 X2 X3 X5 X7 X8 X9 0.88    0.848   0.797  29.8  63.7 -26.5    74.9 0.492 0.461 0.194  0.017  
444 6 X1 X3 X4 X5 X6 X9 0.879   0.847   0.8    30    63.8 -26.5    75   0.494 0.463 0.195  0.0171 
445 6 X1 X2 X4 X6 X7 X9 0.878   0.846   0.801  30.5  64.1 -26.2    75.3 0.5   0.469 0.197  0.0173 
446 6 X1 X4 X5 X6 X7 X9 0.877   0.844   0.803  30.9  64.4 -26.1    75.6 0.504 0.472 0.198  0.0174 
447 6 X4 X5 X6 X7 X8 X9 0.875   0.843   0.787  31.3  64.7 -25.9    75.9 0.509 0.477 0.2    0.0176 
448 6 X2 X4 X5 X7 X8 X9 0.873   0.84    0.78   32.3  65.2 -25.5    76.4 0.518 0.486 0.204  0.0179 
449 6 X2 X4 X6 X7 X8 X9 0.872   0.838   0.778  32.7  65.5 -25.3    76.7 0.523 0.491 0.206  0.0181 
450 6 X2 X4 X5 X6 X8 X9 0.872   0.838   0.786  32.8  65.6 -25.3    76.8 0.524 0.492 0.207  0.0181 
451 6 X1 X3 X5 X6 X8 X9 0.871   0.837   0.783  33.2  65.8 -25.1    77   0.528 0.495 0.208  0.0183 
452 6 X1 X2 X3 X5 X8 X9 0.865   0.83    0.789  35.2  67   -24.3    78.2 0.55  0.516 0.217  0.019  
453 6 X2 X3 X5 X6 X7 X9 0.864   0.828   0.781  35.8  67.3 -24      78.5 0.556 0.521 0.219  0.0192 
454 6 X1 X2 X3 X5 X6 X8 0.859   0.823   0.769  37.4  68.3 -23.4    79.5 0.574 0.538 0.226  0.0198 
455 6 X1 X2 X3 X5 X6 X9 0.847   0.807   0.745  42.2  70.8 -21.6    82   0.625 0.586 0.246  0.0216 
456 6 X2 X3 X5 X6 X8 X9 0.836   0.793   0.739  46.3  72.9 -20      84.1 0.669 0.628 0.264  0.0231 
457 6 X2 X4 X5 X6 X7 X8 0.825   0.779   0.707  50.5  74.9 -18.6    86.1 0.715 0.67  0.282  0.0247 
458 6 X2 X4 X5 X6 X7 X9 0.821   0.775   0.702  51.9  75.5 -18.1    86.7 0.729 0.684 0.287  0.0252 
459 6 X2 X5 X6 X7 X8 X9 0.748   0.682   0.563  79.9  85.8 -9.87   97   1.03  0.966 0.406  0.0356 
460 6 X1 X2 X5 X6 X8 X9 0.745   0.679   0.597  80.8  86.1 -9.62   97.3 1.04  0.976 0.41   0.036  
461 6 X1 X5 X6 X7 X8 X9 0.724   0.652   0.528  88.9  88.5 -7.63   99.7 1.13  1.06  0.444  0.0389 
462 6 X1 X2 X5 X7 X8 X9 0.714   0.639   0.548  92.8  89.6 -6.7    101   1.17  1.1   0.461  0.0404 
463 6 X1 X2 X5 X6 X7 X8 0.683   0.6     0.506  105    92.7 -4.06   104   1.3   1.22  0.511  0.0448 
464 6 X1 X2 X5 X6 X7 X9 0.661   0.573   0.461  113    94.7 -2.37   106   1.38  1.3   0.545  0.0478 
465 6 X1 X2 X6 X7 X8 X9 0.604   0.5     0.352  135    99.4 1.75   111   1.62  1.52  0.638  0.0559 
466 7 X3 X4 X5 X6 X7 X8 X9 0.947   0.93    0.902  6.06 40.9 -36.8    53.5 0.236 0.217 0.0912 0.00816
467 7 X2 X3 X4 X6 X7 X8 X9 0.947   0.93    0.902  6.14 41   -36.8    53.6 0.237 0.218 0.0916 0.00819
468 7 X1 X3 X4 X6 X7 X8 X9 0.947   0.93    0.901  6.22 41.1 -36.7    53.7 0.238 0.219 0.0919 0.00822
469 7 X1 X2 X3 X6 X7 X8 X9 0.944   0.927   0.897  7.13 42.5 -36.3    55.1 0.249 0.229 0.0961 0.00859
470 7 X1 X3 X5 X6 X7 X8 X9 0.943   0.925   0.892  7.69 43.2 -36.1    55.9 0.255 0.235 0.0986 0.00882
471 7 X1 X2 X4 X5 X6 X8 X9 0.939   0.919   0.897  9.28 45.4 -35.3    58   0.274 0.252 0.106  0.00947
472 7 X1 X2 X4 X5 X7 X8 X9 0.938   0.918   0.891  9.55 45.7 -35.2    58.3 0.277 0.255 0.107  0.00958
473 7 X1 X3 X4 X5 X7 X8 X9 0.937   0.917   0.889  9.81 46   -35.1    58.7 0.28  0.258 0.108  0.00968
474 7 X1 X2 X3 X4 X5 X8 X9 0.937   0.917   0.895  9.97 46.2 -35      58.9 0.282 0.259 0.109  0.00975
475 7 X1 X2 X3 X4 X7 X8 X9 0.937   0.916   0.883  10.1  46.4 -34.9    59   0.283 0.26  0.109  0.00979
476 7 X1 X4 X5 X6 X7 X8 X9 0.936   0.916   0.888  10.2  46.5 -34.9    59.1 0.284 0.261 0.11   0.00982
477 7 X2 X3 X4 X5 X7 X8 X9 0.933   0.911   0.874  11.6  48.2 -34.2    60.8 0.301 0.276 0.116  0.0104 
478 7 X1 X2 X3 X4 X6 X8 X9 0.931   0.909   0.877  12.2  49   -33.9    61.6 0.309 0.284 0.119  0.0107 
479 7 X1 X3 X4 X5 X6 X8 X9 0.93    0.907   0.878  12.8  49.6 -33.7    62.2 0.315 0.289 0.122  0.0109 
480 7 X2 X3 X4 X5 X6 X8 X9 0.926   0.902   0.868  14.3  51.2 -33      63.8 0.333 0.306 0.128  0.0115 
481 7 X1 X2 X4 X6 X7 X8 X9 0.925   0.9     0.861  14.7  51.6 -32.8    64.2 0.337 0.31  0.13   0.0117 
482 7 X2 X3 X5 X6 X7 X8 X9 0.921   0.897   0.864  15.8  52.8 -32.2    65.4 0.351 0.323 0.136  0.0121 
483 7 X1 X2 X3 X5 X7 X8 X9 0.914   0.887   0.838  18.6  55.4 -31      68   0.383 0.352 0.148  0.0132 
484 7 X1 X2 X3 X5 X6 X7 X8 0.91    0.881   0.834  20.3  57   -30.2    69.6 0.403 0.371 0.156  0.0139 
485 7 X1 X3 X4 X5 X6 X7 X8 0.909   0.88    0.832  20.5  57.1 -30.1    69.7 0.405 0.373 0.157  0.014  
486 7 X1 X2 X3 X4 X6 X7 X8 0.908   0.878   0.826  21.1  57.7 -29.8    70.3 0.413 0.379 0.159  0.0143 
487 7 X2 X3 X4 X5 X6 X7 X8 0.908   0.878   0.83   21.1  57.7 -29.8    70.3 0.413 0.38  0.159  0.0143 
488 7 X1 X2 X3 X4 X5 X7 X8 0.901   0.87    0.82   23.4  59.6 -28.8    72.2 0.441 0.405 0.17   0.0152 
489 7 X1 X2 X3 X4 X5 X6 X7 0.9     0.869   0.824  23.8  59.9 -28.6    72.5 0.445 0.409 0.172  0.0154 
490 7 X1 X2 X3 X5 X6 X7 X9 0.9     0.869   0.822  23.9  60   -28.6    72.6 0.446 0.41  0.172  0.0154 
491 7 X1 X3 X4 X5 X6 X7 X9 0.9     0.868   0.821  24.1  60.2 -28.5    72.8 0.448 0.412 0.173  0.0155 
492 7 X2 X3 X4 X5 X6 X7 X9 0.899   0.866   0.821  24.6  60.5 -28.3    73.1 0.454 0.417 0.175  0.0157 
493 7 X1 X2 X3 X4 X6 X7 X9 0.898   0.865   0.815  24.9  60.8 -28.1    73.4 0.458 0.421 0.177  0.0158 
494 7 X1 X2 X3 X4 X5 X6 X8 0.896   0.863   0.817  25.6  61.3 -27.8    73.9 0.466 0.428 0.18   0.0161 
495 7 X1 X2 X3 X4 X5 X7 X9 0.894   0.86    0.814  26.3  61.8 -27.5    74.4 0.474 0.435 0.183  0.0164 
496 7 X1 X2 X4 X5 X6 X7 X8 0.891   0.856   0.809  27.5  62.7 -27      75.3 0.488 0.448 0.188  0.0169 
497 7 X1 X2 X3 X4 X5 X6 X9 0.887   0.851   0.804  28.9  63.7 -26.4    76.3 0.505 0.464 0.195  0.0174 
498 7 X1 X2 X3 X5 X6 X8 X9 0.884   0.848   0.8    29.9  64.4 -26      77   0.516 0.475 0.2    0.0178 
499 7 X1 X2 X4 X5 X6 X7 X9 0.883   0.846   0.8    30.3  64.7 -25.8    77.3 0.522 0.48  0.202  0.018  
500 7 X2 X4 X5 X6 X7 X8 X9 0.876   0.837   0.768  33.1  66.5 -24.7    79.1 0.554 0.51  0.214  0.0192 
501 7 X1 X2 X5 X6 X7 X8 X9 0.748   0.668   0.542  81.7  87.8 -9.28   100   1.13  1.04  0.435  0.0389 
502 8 X2 X3 X4 X5 X6 X7 X8 X9 0.947   0.927   0.896  8    42.8 -33.8    56.8 0.259 0.233 0.0977 0.00895
503 8 X1 X3 X4 X5 X6 X7 X8 X9 0.947   0.927   0.897  8.02 42.8 -33.8    56.9 0.259 0.233 0.0978 0.00896
504 8 X1 X2 X3 X4 X6 X7 X8 X9 0.947   0.927   0.892  8.07 42.9 -33.8    56.9 0.26  0.233 0.0981 0.00898
505 8 X1 X2 X3 X5 X6 X7 X8 X9 0.945   0.924   0.89   8.89 44.1 -33.5    58.1 0.27  0.243 0.102  0.00934
506 8 X1 X2 X4 X5 X6 X7 X8 X9 0.941   0.918   0.888  10.6  46.4 -33      60.4 0.292 0.262 0.11   0.0101 
507 8 X1 X2 X3 X4 X5 X7 X8 X9 0.939   0.915   0.881  11.3  47.3 -32.7    61.3 0.301 0.27  0.114  0.0104 
508 8 X1 X2 X3 X4 X5 X6 X8 X9 0.939   0.915   0.886  11.3  47.4 -32.7    61.4 0.301 0.271 0.114  0.0104 
509 8 X1 X2 X3 X4 X5 X6 X7 X8 0.91    0.875   0.817  22.3  59   -28.4    73   0.444 0.399 0.168  0.0153 
510 8 X1 X2 X3 X4 X5 X6 X7 X9 0.9     0.863   0.807  25.8  61.9 -27      75.9 0.49  0.44  0.185  0.0169 
511 9 X1 X2 X3 X4 X5 X6 X7 X8 X9 0.947   0.924   0.886  10    44.8 -30.8    60.2 0.286 0.25  0.105  0.00989
plot(k)
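The `cp` column in these tables is Mallows' Cp. As a quick reference for how it is computed (this is a standalone Python sketch of the standard formula, not the olsrr internals; the variable names and the synthetic data are illustrative assumptions):

```python
import numpy as np

def mallows_cp(y, X_sub, X_full):
    """Mallows' Cp for a candidate subset model:
    Cp = SSE_p / MSE_full - (n - 2p),
    where p counts the subset model's coefficients (intercept included)
    and MSE_full comes from the model with all predictors."""
    n = len(y)

    def sse(X):
        # Fit by least squares with an intercept and return the SSE.
        Z = np.column_stack([np.ones(n), X])
        beta, *_ = np.linalg.lstsq(Z, y, rcond=None)
        resid = y - Z @ beta
        return resid @ resid

    p_full = X_full.shape[1] + 1
    mse_full = sse(X_full) / (n - p_full)
    p_sub = X_sub.shape[1] + 1
    return sse(X_sub) / mse_full - (n - 2 * p_sub)

# Sanity check on synthetic data: for the full model itself, Cp = p.
rng = np.random.default_rng(0)
X = rng.normal(size=(30, 4))
y = X @ np.array([1.0, 0.5, 0.0, 0.0]) + rng.normal(size=30)
print(round(mallows_cp(y, X, X), 6))  # -> 5.0
```

A subset that drops only negligible predictors keeps Cp near p, while dropping a strong predictor inflates SSE_p and pushes Cp well above p — which is why rows with Cp far above the predictor count rank poorly in the tables above.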

# All possible regressions for the X4-eliminated model #
k <- ols_step_all_possible(model_wf_rm4_log)
k
mindex n predictors rsquare adjr predrsq cp aic sbic sbc msep fpe apc hsp
# column key: mindex = model index, n = number of predictors, rsquare = R^2, adjr = adjusted R^2,
# predrsq = predicted R^2, cp = Mallows' Cp, aic = AIC, sbic = Sawa's BIC, sbc = Schwarz's BIC,
# msep = estimated MSE of prediction, fpe = final prediction error, apc = Amemiya's prediction
# criterion, hsp = Hocking's Sp
1 1 X1 0.527   0.51    0.466  155    94.7 4.85    98.9 1.29  1.29  0.541  0.0447 
2 1 X5 0.523   0.506   0.472  156    94.9 5.07    99.2 1.3   1.3   0.545  0.0451 
3 1 X2 0.433   0.413   0.346  190    100   10       104   1.55  1.54  0.648  0.0535 
4 1 X7 0.35    0.327   0.272  222    104   14       108   1.78  1.77  0.743  0.0614 
5 1 X3 0.227   0.199   0.133  269    109   19.1     114   2.11  2.1   0.884  0.073  
6 1 X8 0.0407  0.00648 -0.0793 340    116   25.4     120   2.62  2.61  1.1    0.0906 
7 1 X9 0.0176  -0.0175  -0.104  349    117   26.1     121   2.68  2.67  1.12   0.0928 
8 1 X6 0.00292 -0.0327  -0.125  355    117   26.5     121   2.72  2.71  1.14   0.0942 
9 2 X3 X7 0.832   0.819   0.798  40.3  65.7 -23.2     71.3 0.495 0.49  0.206  0.0171 
10 2 X1 X3 0.812   0.798   0.775  47.8  69   -20.3     74.6 0.553 0.547 0.23   0.0191 
11 2 X2 X3 0.709   0.688   0.644  87    82.1 -8.51    87.7 0.855 0.845 0.355  0.0295 
12 2 X2 X5 0.646   0.62    0.592  111    88   -3.03    93.6 1.04  1.03  0.433  0.036  
13 2 X1 X5 0.604   0.575   0.539  127    91.3 0.104   96.9 1.16  1.15  0.484  0.0402 
14 2 X5 X7 0.6     0.57    0.536  129    91.6 0.4     97.3 1.18  1.16  0.489  0.0407 
15 2 X3 X5 0.564   0.531   0.489  143    94.3 2.87    99.9 1.28  1.27  0.533  0.0444 
16 2 X5 X8 0.553   0.52    0.479  147    95   3.56    101   1.32  1.3   0.547  0.0455 
17 2 X5 X6 0.544   0.511   0.451  150    95.6 4.09    101   1.34  1.32  0.557  0.0463 
18 2 X1 X2 0.543   0.509   0.477  151    95.7 4.17    101   1.34  1.33  0.558  0.0465 
19 2 X1 X8 0.533   0.498   0.44   155    96.3 4.82    102   1.38  1.36  0.571  0.0475 
20 2 X1 X9 0.528   0.493   0.42   156    96.6 5.11    102   1.39  1.37  0.577  0.048  
21 2 X1 X6 0.528   0.493   0.429  156    96.7 5.12    102   1.39  1.37  0.577  0.048  
22 2 X1 X7 0.527   0.492   0.437  157    96.7 5.17    102   1.39  1.38  0.578  0.0481 
23 2 X5 X9 0.526   0.491   0.439  157    96.7 5.21    102   1.39  1.38  0.579  0.0482 
24 2 X2 X8 0.451   0.41    0.335  186    101   9.41    107   1.62  1.6   0.671  0.0558 
25 2 X2 X7 0.44    0.398   0.318  190    102   10       107   1.65  1.63  0.685  0.057  
26 2 X2 X9 0.435   0.393   0.301  192    102   10.2     108   1.66  1.64  0.69   0.0574 
27 2 X2 X6 0.433   0.391   0.309  192    102   10.3     108   1.67  1.65  0.693  0.0576 
28 2 X7 X8 0.361   0.314   0.249  220    106   13.8     111   1.88  1.86  0.781  0.065  
29 2 X6 X7 0.35    0.302   0.224  224    106   14.2     112   1.91  1.89  0.794  0.0661 
30 2 X7 X9 0.35    0.302   0.224  224    106   14.3     112   1.91  1.89  0.794  0.0661 
31 2 X3 X8 0.282   0.229   0.133  250    109   17.1     115   2.11  2.09  0.877  0.073  
32 2 X3 X9 0.265   0.21    0.107  257    110   17.8     116   2.16  2.14  0.899  0.0748 
33 2 X3 X6 0.227   0.17    0.0769 271    111   19.3     117   2.27  2.25  0.945  0.0786 
34 2 X8 X9 0.051   -0.0193  -0.168  339    118   25.3     123   2.79  2.76  1.16   0.0965 
35 2 X6 X8 0.0423  -0.0286  -0.138  342    118   25.5     123   2.82  2.79  1.17   0.0974 
36 2 X6 X9 0.0207  -0.0519  -0.162  350    119   26.2     124   2.88  2.85  1.2    0.0996 
37 3 X1 X3 X7 0.872   0.857   0.831  27    59.5 -29       66.5 0.408 0.399 0.168  0.0141 
38 3 X3 X6 X7 0.855   0.838   0.809  33.4  63.2 -26       70.2 0.461 0.452 0.19   0.0159 
39 3 X3 X7 X8 0.845   0.828   0.8    37.1  65.1 -24.4     72.1 0.491 0.481 0.202  0.017  
40 3 X2 X3 X7 0.835   0.816   0.783  41    67.1 -22.8     74.1 0.524 0.513 0.216  0.0181 
41 3 X3 X5 X7 0.835   0.815   0.789  41.2  67.2 -22.7     74.2 0.526 0.515 0.216  0.0182 
42 3 X3 X7 X9 0.833   0.813   0.788  41.9  67.5 -22.4     74.5 0.532 0.521 0.219  0.0184 
43 3 X1 X2 X3 0.831   0.812   0.795  42.5  67.8 -22.2     74.8 0.537 0.526 0.221  0.0186 
44 3 X1 X3 X5 0.826   0.806   0.773  44.5  68.7 -21.4     75.7 0.553 0.542 0.228  0.0191 
45 3 X1 X3 X6 0.826   0.805   0.769  44.6  68.8 -21.4     75.8 0.554 0.543 0.228  0.0192 
46 3 X1 X3 X8 0.824   0.804   0.767  45.3  69.1 -21.1     76.1 0.56  0.548 0.23   0.0193 
47 3 X1 X3 X9 0.813   0.791   0.75   49.5  70.9 -19.6     77.9 0.595 0.583 0.245  0.0206 
48 3 X2 X3 X5 0.761   0.734   0.7    69.2  78.2 -13.2     85.2 0.758 0.743 0.312  0.0262 
49 3 X2 X3 X8 0.737   0.707   0.649  78.4  81.1 -10.7     88.1 0.835 0.818 0.344  0.0289 
50 3 X2 X3 X9 0.721   0.689   0.62   84.4  82.8 -9.1     89.8 0.885 0.867 0.364  0.0306 
51 3 X2 X3 X6 0.714   0.681   0.623  87.1  83.6 -8.43    90.6 0.908 0.889 0.374  0.0314 
52 3 X2 X5 X8 0.665   0.627   0.594  106    88.3 -4.13    95.3 1.06  1.04  0.437  0.0367 
53 3 X2 X5 X6 0.66    0.621   0.575  108    88.7 -3.73    95.7 1.08  1.06  0.444  0.0373 
54 3 X1 X2 X5 0.647   0.606   0.579  113    89.9 -2.67    96.9 1.12  1.1   0.461  0.0387 
55 3 X2 X5 X9 0.647   0.606   0.557  113    89.9 -2.63    96.9 1.12  1.1   0.462  0.0388 
56 3 X2 X5 X7 0.646   0.605   0.551  113    90   -2.59    97   1.12  1.1   0.463  0.0389 
57 3 X5 X6 X7 0.62    0.577   0.516  123    92.1 -0.657   99.1 1.21  1.18  0.496  0.0417 
58 3 X1 X5 X8 0.618   0.573   0.525  124    92.3 -0.443   99.3 1.22  1.19  0.5    0.042  
59 3 X1 X5 X6 0.617   0.573   0.518  124    92.4 -0.39    99.4 1.22  1.19  0.501  0.0421 
60 3 X5 X7 X8 0.616   0.572   0.531  125    92.4 -0.36    99.4 1.22  1.19  0.502  0.0421 
61 3 X1 X5 X7 0.611   0.566   0.518  126    92.8 0.00417 99.8 1.24  1.21  0.508  0.0427 
62 3 X1 X5 X9 0.604   0.559   0.499  129    93.3 0.497   100   1.26  1.23  0.517  0.0435 
63 3 X3 X5 X8 0.601   0.555   0.503  130    93.6 0.743   101   1.27  1.24  0.522  0.0438 
64 3 X5 X7 X9 0.6     0.554   0.502  131    93.6 0.792   101   1.27  1.24  0.523  0.0439 
65 3 X5 X8 X9 0.598   0.551   0.488  132    93.8 0.97    101   1.28  1.25  0.526  0.0442 
66 3 X3 X5 X6 0.588   0.54    0.466  135    94.6 1.64    102   1.31  1.28  0.539  0.0453 
67 3 X1 X8 X9 0.582   0.534   0.44   138    95   2.02    102   1.33  1.3   0.547  0.0459 
68 3 X5 X6 X8 0.578   0.529   0.465  139    95.3 2.31    102   1.34  1.31  0.552  0.0464 
69 3 X3 X5 X9 0.573   0.524   0.461  141    95.6 2.63    103   1.36  1.33  0.558  0.0469 
70 3 X1 X2 X7 0.552   0.5     0.444  149    97.1 3.99    104   1.43  1.4   0.586  0.0493 
71 3 X1 X2 X8 0.55    0.498   0.456  150    97.2 4.07    104   1.43  1.4   0.588  0.0494 
72 3 X5 X6 X9 0.547   0.495   0.415  151    97.4 4.27    104   1.44  1.41  0.592  0.0498 
73 3 X1 X2 X6 0.544   0.491   0.441  152    97.6 4.48    105   1.45  1.42  0.597  0.0501 
74 3 X1 X2 X9 0.544   0.491   0.43   152    97.6 4.48    105   1.45  1.42  0.597  0.0501 
75 3 X1 X6 X8 0.534   0.48    0.401  156    98.3 5.08    105   1.48  1.45  0.61   0.0512 
76 3 X1 X7 X8 0.533   0.479   0.409  156    98.3 5.13    105   1.48  1.45  0.611  0.0513 
77 3 X1 X6 X9 0.529   0.475   0.376  158    98.6 5.38    106   1.5   1.47  0.616  0.0517 
78 3 X1 X7 X9 0.528   0.473   0.389  158    98.6 5.43    106   1.5   1.47  0.617  0.0518 
79 3 X1 X6 X7 0.528   0.473   0.394  158    98.6 5.44    106   1.5   1.47  0.618  0.0519 
80 3 X2 X8 X9 0.476   0.415   0.295  178    102   8.39    109   1.67  1.63  0.686  0.0576 
81 3 X2 X7 X8 0.455   0.392   0.3    186    103   9.51    110   1.73  1.7   0.713  0.0599 
82 3 X2 X6 X8 0.451   0.388   0.295  188    103   9.69    110   1.74  1.71  0.718  0.0603 
83 3 X2 X7 X9 0.44    0.376   0.271  192    104   10.2     111   1.78  1.74  0.732  0.0615 
84 3 X2 X6 X7 0.44    0.375   0.27   192    104   10.3     111   1.78  1.74  0.733  0.0616 
85 3 X2 X6 X9 0.435   0.37    0.259  194    104   10.5     111   1.79  1.76  0.738  0.062  
86 3 X7 X8 X9 0.408   0.34    0.216  204    105   11.8     112   1.88  1.84  0.774  0.065  
87 3 X6 X7 X8 0.361   0.288   0.199  222    108   14       115   2.03  1.99  0.835  0.0701 
88 3 X6 X7 X9 0.35    0.276   0.168  226    108   14.5     115   2.06  2.02  0.849  0.0713 
89 3 X3 X6 X8 0.283   0.201   0.0775 252    111   17.3     118   2.28  2.23  0.937  0.0787 
90 3 X3 X8 X9 0.283   0.2     0.0776 252    111   17.3     118   2.28  2.23  0.937  0.0787 
91 3 X3 X6 X9 0.265   0.18    0.047  259    112   18       119   2.34  2.29  0.961  0.0807 
92 3 X6 X8 X9 0.0515  -0.0579  -0.242  340    120   25.4     127   3.01  2.95  1.24   0.104  
93 4 X1 X3 X6 X7 0.893   0.876   0.845  20.8  56   -31.9     64.5 0.368 0.356 0.15   0.0127 
94 4 X1 X3 X5 X7 0.883   0.864   0.833  24.8  58.8 -29.8     67.3 0.404 0.391 0.164  0.014  
95 4 X1 X3 X7 X8 0.882   0.864   0.831  24.9  58.9 -29.7     67.3 0.405 0.392 0.165  0.014  
96 4 X1 X3 X7 X9 0.872   0.852   0.818  28.9  61.5 -27.8     69.9 0.441 0.426 0.179  0.0152 
97 4 X1 X2 X3 X7 0.872   0.851   0.822  29    61.5 -27.8     69.9 0.441 0.427 0.18   0.0153 
98 4 X3 X6 X7 X8 0.871   0.85    0.814  29.4  61.8 -27.6     70.2 0.445 0.431 0.181  0.0154 
99 4 X3 X7 X8 X9 0.869   0.848   0.817  30    62.1 -27.3     70.5 0.45  0.436 0.183  0.0156 
100 4 X3 X5 X6 X7 0.861   0.839   0.802  33.1  63.9 -25.9     72.3 0.478 0.463 0.194  0.0165 
101 4 X2 X3 X6 X7 0.856   0.833   0.8    34.9  64.9 -25.1     73.3 0.494 0.479 0.201  0.0171 
102 4 X3 X6 X7 X9 0.856   0.833   0.797  35.1  65.1 -25       73.5 0.497 0.481 0.202  0.0172 
103 4 X2 X3 X7 X8 0.85    0.826   0.784  37.4  66.3 -24.1     74.7 0.517 0.501 0.21   0.0179 
104 4 X3 X5 X7 X8 0.849   0.825   0.793  37.7  66.5 -23.9     74.9 0.52  0.504 0.212  0.018  
105 4 X1 X3 X8 X9 0.847   0.822   0.788  38.6  66.9 -23.6     75.3 0.528 0.511 0.215  0.0183 
106 4 X1 X2 X3 X8 0.845   0.82    0.792  39.3  67.3 -23.3     75.7 0.535 0.518 0.217  0.0185 
107 4 X1 X2 X3 X6 0.843   0.818   0.786  39.9  67.6 -23.1     76   0.54  0.522 0.22   0.0187 
108 4 X1 X3 X6 X8 0.839   0.813   0.763  41.5  68.3 -22.4     76.8 0.554 0.536 0.225  0.0192 
109 4 X2 X3 X5 X7 0.839   0.813   0.773  41.7  68.4 -22.4     76.9 0.556 0.538 0.226  0.0192 
110 4 X1 X2 X3 X5 0.837   0.811   0.784  42.3  68.8 -22.1     77.2 0.562 0.544 0.228  0.0194 
111 4 X2 X3 X7 X9 0.837   0.81    0.771  42.4  68.8 -22.1     77.2 0.563 0.544 0.229  0.0194 
112 4 X3 X5 X7 X9 0.835   0.809   0.779  42.9  69   -21.9     77.4 0.567 0.548 0.23   0.0196 
113 4 X1 X3 X5 X8 0.835   0.808   0.761  43.1  69.1 -21.8     77.5 0.568 0.55  0.231  0.0196 
114 4 X1 X3 X5 X6 0.834   0.808   0.764  43.3  69.2 -21.7     77.6 0.57  0.552 0.232  0.0197 
115 4 X1 X2 X3 X9 0.833   0.806   0.771  44    69.5 -21.5     77.9 0.577 0.558 0.234  0.0199 
116 4 X1 X3 X5 X9 0.826   0.799   0.744  46.3  70.6 -20.6     79   0.597 0.578 0.243  0.0206 
117 4 X1 X3 X6 X9 0.826   0.798   0.743  46.4  70.7 -20.6     79.1 0.599 0.579 0.243  0.0207 
118 4 X2 X3 X5 X8 0.788   0.754   0.711  61.2  76.7 -15.6     85.1 0.731 0.708 0.297  0.0253 
119 4 X2 X3 X5 X6 0.777   0.742   0.689  65.1  78.1 -14.4     86.5 0.767 0.742 0.312  0.0265 
120 4 X2 X3 X5 X9 0.768   0.731   0.675  68.6  79.3 -13.3     87.7 0.799 0.773 0.325  0.0276 
121 4 X2 X3 X8 X9 0.744   0.703   0.639  77.7  82.2 -10.8     90.7 0.881 0.852 0.358  0.0304 
122 4 X2 X3 X6 X8 0.744   0.703   0.628  77.8  82.3 -10.8     90.7 0.881 0.853 0.358  0.0305 
123 4 X2 X3 X6 X9 0.726   0.682   0.594  84.6  84.3 -9       92.7 0.943 0.912 0.383  0.0326 
124 4 X2 X5 X8 X9 0.712   0.666   0.613  89.8  85.8 -7.71    94.2 0.99  0.958 0.403  0.0342 
125 4 X2 X5 X6 X8 0.682   0.632   0.58   101    88.7 -5.06    97.1 1.09  1.06  0.445  0.0378 
126 4 X5 X7 X8 X9 0.679   0.628   0.561  103    89.1 -4.78    97.5 1.11  1.07  0.449  0.0382 
127 4 X1 X5 X8 X9 0.672   0.62    0.548  105    89.7 -4.22    98.1 1.13  1.09  0.459  0.039  
128 4 X2 X5 X7 X8 0.666   0.612   0.549  108    90.3 -3.68    98.7 1.15  1.11  0.468  0.0398 
129 4 X1 X2 X5 X8 0.665   0.612   0.575  108    90.3 -3.66    98.7 1.15  1.11  0.468  0.0398 
130 4 X1 X2 X5 X6 0.661   0.607   0.562  110    90.7 -3.3     99.1 1.17  1.13  0.475  0.0403 
131 4 X2 X5 X6 X9 0.661   0.607   0.536  110    90.7 -3.3     99.1 1.17  1.13  0.475  0.0404 
132 4 X2 X5 X6 X7 0.661   0.606   0.527  110    90.7 -3.28    99.1 1.17  1.13  0.475  0.0404 
133 4 X1 X2 X5 X9 0.647   0.591   0.541  115    91.9 -2.25    100   1.21  1.17  0.494  0.0419 
134 4 X1 X2 X5 X7 0.647   0.591   0.537  115    91.9 -2.24    100   1.21  1.17  0.494  0.042  
135 4 X2 X5 X7 X9 0.647   0.59    0.515  115    91.9 -2.19    100   1.22  1.18  0.495  0.042  
136 4 X5 X6 X8 X9 0.641   0.583   0.484  117    92.4 -1.74    101   1.24  1.2   0.503  0.0427 
137 4 X5 X6 X7 X8 0.639   0.582   0.515  118    92.6 -1.63    101   1.24  1.2   0.505  0.0429 
138 4 X1 X5 X6 X8 0.633   0.574   0.507  120    93.1 -1.13    102   1.26  1.22  0.514  0.0437 
139 4 X3 X5 X6 X8 0.629   0.57    0.49   122    93.4 -0.887   102   1.28  1.23  0.519  0.0441 
140 4 X3 X5 X8 X9 0.628   0.569   0.513  122    93.5 -0.816   102   1.28  1.24  0.52   0.0442 
141 4 X1 X5 X6 X7 0.627   0.567   0.497  123    93.6 -0.711   102   1.28  1.24  0.522  0.0444 
142 4 X1 X5 X7 X8 0.624   0.564   0.502  123    93.8 -0.534   102   1.29  1.25  0.526  0.0447 
143 4 X5 X6 X7 X9 0.621   0.56    0.474  125    94.1 -0.249   102   1.31  1.26  0.531  0.0452 
144 4 X1 X5 X6 X9 0.617   0.556   0.472  126    94.4 0.0133  103   1.32  1.28  0.536  0.0456 
145 4 X1 X5 X7 X9 0.611   0.549   0.477  128    94.8 0.396   103   1.34  1.29  0.544  0.0462 
146 4 X3 X5 X6 X9 0.596   0.531   0.439  134    95.9 1.46    104   1.39  1.35  0.565  0.0481 
147 4 X1 X2 X8 X9 0.595   0.531   0.451  135    96   1.51    104   1.39  1.35  0.567  0.0482 
148 4 X1 X6 X8 X9 0.588   0.522   0.399  137    96.5 2       105   1.42  1.37  0.577  0.049  
149 4 X1 X7 X8 X9 0.582   0.515   0.407  140    97   2.4     105   1.44  1.39  0.585  0.0497 
150 4 X1 X2 X7 X8 0.56    0.49    0.419  148    98.5 3.8     107   1.51  1.47  0.616  0.0523 
151 4 X1 X2 X6 X7 0.552   0.48    0.392  151    99.1 4.32    107   1.54  1.49  0.627  0.0533 
152 4 X1 X2 X7 X9 0.552   0.48    0.392  151    99.1 4.34    107   1.54  1.49  0.628  0.0534 
153 4 X1 X2 X6 X8 0.551   0.479   0.418  151    99.1 4.37    108   1.55  1.5   0.628  0.0534 
154 4 X1 X2 X6 X9 0.544   0.471   0.387  154    99.6 4.78    108   1.57  1.52  0.638  0.0542 
155 4 X1 X6 X7 X8 0.534   0.46    0.363  158    100   5.4     109   1.6   1.55  0.652  0.0554 
156 4 X1 X6 X7 X9 0.529   0.454   0.336  160    101   5.7     109   1.62  1.57  0.659  0.056  
157 4 X2 X7 X8 X9 0.485   0.403   0.269  177    103   8.17    112   1.77  1.71  0.72   0.0612 
158 4 X2 X6 X8 X9 0.476   0.393   0.251  180    104   8.65    112   1.8   1.74  0.733  0.0623 
159 4 X2 X6 X7 X8 0.455   0.367   0.25   188    105   9.79    113   1.88  1.82  0.763  0.0649 
160 4 X2 X6 X7 X9 0.44    0.351   0.216  194    106   10.5     114   1.93  1.86  0.784  0.0666 
161 4 X6 X7 X8 X9 0.412   0.318   0.158  205    107   11.9     116   2.02  1.96  0.823  0.07   
162 4 X3 X6 X8 X9 0.285   0.17    0.0102 253    113   17.5     121   2.46  2.38  1      0.0851 
163 5 X1 X3 X7 X8 X9 0.909   0.89    0.857  16.8  53.2 -33.6     63.1 0.341 0.325 0.137  0.0118 
164 5 X1 X3 X6 X7 X8 0.906   0.886   0.848  18    54.3 -32.9     64.1 0.353 0.337 0.141  0.0122 
165 5 X3 X6 X7 X8 X9 0.905   0.885   0.856  18.3  54.5 -32.8     64.3 0.355 0.339 0.142  0.0123 
166 5 X1 X3 X5 X6 X7 0.898   0.877   0.841  20.8  56.6 -31.4     66.4 0.381 0.363 0.153  0.0132 
167 5 X1 X2 X3 X6 X7 0.893   0.871   0.84   22.7  58   -30.5     67.8 0.399 0.38  0.16   0.0138 
168 5 X1 X3 X6 X7 X9 0.893   0.871   0.832  22.8  58   -30.4     67.8 0.399 0.381 0.16   0.0138 
169 5 X1 X3 X5 X7 X8 0.891   0.868   0.828  23.7  58.7 -30       68.5 0.408 0.389 0.164  0.0141 
170 5 X1 X2 X3 X5 X7 0.884   0.86    0.827  26.2  60.5 -28.7     70.3 0.433 0.413 0.174  0.015  
171 5 X1 X3 X5 X7 X9 0.883   0.858   0.816  26.7  60.8 -28.5     70.6 0.438 0.418 0.176  0.0151 
172 5 X1 X2 X3 X7 X8 0.883   0.858   0.821  26.9  60.9 -28.4     70.7 0.44  0.419 0.176  0.0152 
173 5 X3 X5 X6 X7 X8 0.878   0.853   0.811  28.5  62   -27.7     71.8 0.456 0.435 0.183  0.0158 
174 5 X3 X5 X7 X8 X9 0.877   0.851   0.82   29.1  62.4 -27.4     72.2 0.462 0.44  0.185  0.016  
175 5 X2 X3 X6 X7 X8 0.873   0.846   0.804  30.6  63.3 -26.7     73.1 0.476 0.454 0.191  0.0165 
176 5 X1 X2 X3 X7 X9 0.872   0.845   0.807  30.9  63.5 -26.6     73.3 0.479 0.457 0.192  0.0166 
177 5 X2 X3 X7 X8 X9 0.872   0.845   0.798  31.1  63.6 -26.5     73.4 0.481 0.459 0.193  0.0166 
178 5 X1 X3 X6 X8 X9 0.869   0.842   0.798  32    64.1 -26.1     73.9 0.49  0.467 0.196  0.0169 
179 5 X1 X2 X3 X8 X9 0.864   0.836   0.809  34    65.3 -25.3     75.1 0.509 0.486 0.204  0.0176 
180 5 X2 X3 X5 X6 X7 0.863   0.834   0.794  34.3  65.5 -25.1     75.3 0.513 0.489 0.206  0.0177 
181 5 X3 X5 X6 X7 X9 0.862   0.833   0.79   34.8  65.8 -24.9     75.6 0.518 0.494 0.207  0.0179 
182 5 X1 X2 X3 X6 X8 0.858   0.829   0.784  36.1  66.5 -24.4     76.3 0.53  0.505 0.212  0.0183 
183 5 X2 X3 X6 X7 X9 0.857   0.828   0.787  36.5  66.7 -24.2     76.5 0.534 0.509 0.214  0.0184 
184 5 X2 X3 X5 X7 X8 0.854   0.823   0.776  37.8  67.4 -23.7     77.3 0.547 0.522 0.219  0.0189 
185 5 X1 X3 X5 X8 X9 0.853   0.822   0.773  38.2  67.7 -23.5     77.5 0.551 0.526 0.221  0.019  
186 5 X1 X2 X3 X5 X8 0.848   0.816   0.778  40    68.6 -22.8     78.4 0.568 0.542 0.228  0.0196 
187 5 X1 X2 X3 X5 X6 0.846   0.814   0.774  40.9  69   -22.5     78.8 0.577 0.55  0.231  0.0199 
188 5 X1 X3 X5 X6 X8 0.845   0.813   0.753  41.2  69.2 -22.3     79   0.58  0.553 0.232  0.02   
189 5 X1 X2 X3 X6 X9 0.844   0.812   0.76   41.4  69.3 -22.2     79.1 0.582 0.555 0.233  0.0201 
190 5 X2 X3 X5 X7 X9 0.84    0.807   0.76   43.2  70.2 -21.6     80   0.599 0.572 0.24   0.0207 
191 5 X1 X2 X3 X5 X9 0.838   0.804   0.756  43.9  70.5 -21.3     80.4 0.606 0.578 0.243  0.021  
192 5 X1 X3 X5 X6 X9 0.835   0.8     0.734  45.1  71.1 -20.9     80.9 0.618 0.59  0.248  0.0214 
193 5 X2 X3 X5 X8 X9 0.807   0.766   0.726  55.9  75.8 -17.1     85.7 0.723 0.69  0.29   0.025  
194 5 X2 X3 X5 X6 X8 0.807   0.766   0.706  55.9  75.9 -17.1     85.7 0.724 0.691 0.29   0.025  
195 5 X2 X3 X5 X6 X9 0.783   0.738   0.664  64.8  79.3 -14.3     89.1 0.811 0.774 0.325  0.028  
196 5 X2 X3 X6 X8 X9 0.753   0.702   0.622  76.2  83.1 -11.1     92.9 0.923 0.88  0.37   0.0319 
197 5 X2 X5 X6 X8 X9 0.745   0.692   0.612  79.4  84.1 -10.3     94   0.954 0.91  0.382  0.033  
198 5 X5 X6 X7 X8 X9 0.723   0.665   0.559  87.9  86.7 -8.11    96.5 1.04  0.989 0.416  0.0358 
199 5 X2 X5 X7 X8 X9 0.713   0.654   0.57   91.5  87.7 -7.25    97.5 1.07  1.02  0.43   0.0371 
200 5 X1 X2 X5 X8 X9 0.713   0.654   0.598  91.5  87.7 -7.24    97.5 1.07  1.02  0.43   0.0371 
201 5 X1 X5 X6 X8 X9 0.703   0.641   0.539  95.6  88.8 -6.29    98.6 1.11  1.06  0.446  0.0384 
202 5 X1 X5 X7 X8 X9 0.685   0.62    0.526  102    90.5 -4.8     100   1.18  1.12  0.472  0.0407 
203 5 X1 X2 X5 X6 X8 0.683   0.616   0.563  103    90.7 -4.57    101   1.19  1.13  0.476  0.0411 
204 5 X2 X5 X6 X7 X8 0.682   0.616   0.527  103    90.7 -4.57    101   1.19  1.13  0.476  0.0411 
205 5 X3 X5 X6 X8 X9 0.672   0.603   0.512  107    91.7 -3.69    102   1.23  1.17  0.493  0.0425 
206 5 X1 X2 X5 X7 X8 0.666   0.596   0.529  110    92.3 -3.23    102   1.25  1.19  0.501  0.0432 
207 5 X1 X2 X5 X6 X9 0.661   0.591   0.519  111    92.7 -2.86    102   1.27  1.21  0.508  0.0438 
208 5 X2 X5 X6 X7 X9 0.661   0.59    0.483  112    92.7 -2.84    103   1.27  1.21  0.509  0.0438 
209 5 X1 X2 X5 X6 X7 0.661   0.59    0.511  112    92.7 -2.84    103   1.27  1.21  0.509  0.0438 
210 5 X1 X2 X5 X7 X9 0.648   0.574   0.496  117    93.8 -1.82    104   1.32  1.26  0.528  0.0456 
211 5 X1 X5 X6 X7 X8 0.643   0.569   0.486  118    94.2 -1.47    104   1.34  1.27  0.535  0.0462 
212 5 X1 X5 X6 X7 X9 0.627   0.55    0.448  124    95.5 -0.309   105   1.4   1.33  0.559  0.0482 
213 5 X1 X2 X6 X8 X9 0.6     0.517   0.411  135    97.6 1.56    107   1.5   1.43  0.599  0.0517 
214 5 X1 X2 X7 X8 X9 0.599   0.516   0.411  135    97.7 1.62    108   1.5   1.43  0.601  0.0518 
215 5 X1 X6 X7 X8 X9 0.588   0.502   0.359  139    98.5 2.38    108   1.54  1.47  0.618  0.0533 
216 5 X1 X2 X6 X7 X8 0.561   0.469   0.362  150    100   4.13    110   1.64  1.57  0.659  0.0568 
217 5 X1 X2 X6 X7 X9 0.552   0.459   0.329  153    101   4.67    111   1.68  1.6   0.672  0.0579 
218 5 X2 X6 X7 X8 X9 0.487   0.38    0.214  178    105   8.39    115   1.92  1.83  0.769  0.0663 
219 6 X1 X3 X6 X7 X8 X9 0.943   0.928   0.9    5.85 41.3 -38.9     52.5 0.233 0.219 0.092  0.00807
220 6 X3 X5 X6 X7 X8 X9 0.921   0.9     0.873  14.2  51   -34.2     62.2 0.323 0.303 0.127  0.0112 
221 6 X1 X3 X5 X7 X8 X9 0.913   0.89    0.847  17.2  53.9 -32.6     65.1 0.355 0.333 0.14   0.0123 
222 6 X1 X2 X3 X7 X8 X9 0.909   0.885   0.847  18.7  55.2 -31.9     66.4 0.371 0.348 0.146  0.0128 
223 6 X1 X3 X5 X6 X7 X8 0.909   0.885   0.839  18.8  55.3 -31.8     66.5 0.372 0.349 0.147  0.0129 
224 6 X1 X2 X3 X6 X7 X8 0.906   0.881   0.841  20    56.3 -31.2     67.5 0.385 0.361 0.152  0.0133 
225 6 X2 X3 X6 X7 X8 X9 0.905   0.881   0.846  20.1  56.4 -31.2     67.6 0.386 0.362 0.152  0.0133 
226 6 X1 X2 X3 X5 X6 X7 0.9     0.874   0.839  22.1  58   -30.2     69.2 0.407 0.382 0.16   0.0141 
227 6 X1 X3 X5 X6 X7 X9 0.898   0.872   0.824  22.8  58.5 -29.9     69.8 0.415 0.389 0.163  0.0143 
228 6 X1 X2 X3 X6 X7 X9 0.894   0.866   0.825  24.7  59.9 -29       71.2 0.435 0.408 0.171  0.015  
229 6 X1 X2 X3 X5 X7 X8 0.892   0.863   0.82   25.4  60.5 -28.7     71.7 0.443 0.415 0.174  0.0153 
230 6 X1 X2 X3 X6 X8 X9 0.884   0.854   0.814  28.1  62.4 -27.4     73.6 0.472 0.442 0.186  0.0163 
231 6 X1 X2 X3 X5 X7 X9 0.884   0.854   0.811  28.2  62.5 -27.4     73.7 0.473 0.443 0.186  0.0163 
232 6 X2 X3 X5 X6 X7 X8 0.881   0.85    0.803  29.6  63.4 -26.8     74.6 0.487 0.457 0.192  0.0168 
233 6 X2 X3 X5 X7 X8 X9 0.88    0.848   0.797  30    63.7 -26.6     74.9 0.492 0.461 0.194  0.017  
234 6 X1 X3 X5 X6 X8 X9 0.871   0.837   0.783  33.4  65.8 -25.2     77   0.528 0.495 0.208  0.0183 
235 6 X1 X2 X3 X5 X8 X9 0.865   0.83    0.789  35.5  67   -24.3     78.2 0.55  0.516 0.217  0.019  
236 6 X2 X3 X5 X6 X7 X9 0.864   0.828   0.781  36    67.3 -24.1     78.5 0.556 0.521 0.219  0.0192 
237 6 X1 X2 X3 X5 X6 X8 0.859   0.823   0.769  37.7  68.3 -23.4     79.5 0.574 0.538 0.226  0.0198 
238 6 X1 X2 X3 X5 X6 X9 0.847   0.807   0.745  42.5  70.8 -21.6     82   0.625 0.586 0.246  0.0216 
239 6 X2 X3 X5 X6 X8 X9 0.836   0.793   0.739  46.6  72.9 -20.1     84.1 0.669 0.628 0.264  0.0231 
240 6 X2 X5 X6 X7 X8 X9 0.748   0.682   0.563  80.4  85.8 -9.9     97   1.03  0.966 0.406  0.0356 
241 6 X1 X2 X5 X6 X8 X9 0.745   0.679   0.597  81.3  86.1 -9.65    97.3 1.04  0.976 0.41   0.036  
242 6 X1 X5 X6 X7 X8 X9 0.724   0.652   0.528  89.4  88.5 -7.65    99.7 1.13  1.06  0.444  0.0389 
243 6 X1 X2 X5 X7 X8 X9 0.714   0.639   0.548  93.4  89.6 -6.72    101   1.17  1.1   0.461  0.0404 
244 6 X1 X2 X5 X6 X7 X8 0.683   0.6     0.506  105    92.7 -4.08    104   1.3   1.22  0.511  0.0448 
245 6 X1 X2 X5 X6 X7 X9 0.661   0.573   0.461  113    94.7 -2.39    106   1.38  1.3   0.545  0.0478 
246 6 X1 X2 X6 X7 X8 X9 0.604   0.5     0.352  135    99.4 1.73    111   1.62  1.52  0.638  0.0559 
247 7 X1 X2 X3 X6 X7 X8 X9 0.944   0.927   0.897  7.24 42.5 -36.4     55.1 0.249 0.229 0.0961 0.00859
248 7 X1 X3 X5 X6 X7 X8 X9 0.943   0.925   0.892  7.81 43.2 -36.2     55.9 0.255 0.235 0.0986 0.00882
249 7 X2 X3 X5 X6 X7 X8 X9 0.921   0.897   0.864  16    52.8 -32.3     65.4 0.351 0.323 0.136  0.0121 
250 7 X1 X2 X3 X5 X7 X8 X9 0.914   0.887   0.838  18.7  55.4 -31.1     68   0.383 0.352 0.148  0.0132 
251 7 X1 X2 X3 X5 X6 X7 X8 0.91    0.881   0.834  20.5  57   -30.3     69.6 0.403 0.371 0.156  0.0139 
252 7 X1 X2 X3 X5 X6 X7 X9 0.9     0.869   0.822  24.1  60   -28.6     72.6 0.446 0.41  0.172  0.0154 
253 7 X1 X2 X3 X5 X6 X8 X9 0.884   0.848   0.8    30.1  64.4 -26.1     77   0.516 0.475 0.2    0.0178 
254 7 X1 X2 X5 X6 X7 X8 X9 0.748   0.668   0.542  82.2  87.8 -9.31    100   1.13  1.04  0.435  0.0389 
255 8 X1 X2 X3 X5 X6 X7 X8 X9 0.945   0.924   0.89   9    44.1 -33.7     58.1 0.27  0.243 0.102  0.00934
plot(k)
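To screen a table like the one above programmatically, one can compare candidates by minimum AIC and by the usual rule of thumb that a well-specified subset has Cp close to p = k + 1 coefficients. A minimal sketch, using a handful of rows copied from the X4-eliminated table (the selection rules here are standard conventions, not something olsrr applies for you):

```python
# (predictors, k, Cp, AIC, adj R^2), copied from the output above.
candidates = [
    ("X1 X3 X6 X7",          4, 20.80, 56.0, 0.876),
    ("X1 X3 X7 X8 X9",       5, 16.80, 53.2, 0.890),
    ("X1 X3 X6 X7 X8 X9",    6,  5.85, 41.3, 0.928),
    ("X1 X2 X3 X6 X7 X8 X9", 7,  7.24, 42.5, 0.927),
]

# Smallest AIC.
best_aic = min(candidates, key=lambda r: r[3])

# Cp closest to p = k + 1 (the "Cp ~ p" rule of thumb).
best_cp = min(candidates, key=lambda r: abs(r[2] - (r[1] + 1)))

print(best_aic[0])  # -> X1 X3 X6 X7 X8 X9
print(best_cp[0])   # -> X1 X2 X3 X6 X7 X8 X9
```

Note the two criteria need not agree: here AIC favors the six-predictor model, while the Cp-closest-to-p rule marginally prefers the seven-predictor one (|7.24 - 8| = 0.76 versus |5.85 - 7| = 1.15), so the final choice still requires judgment about parsimony.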

# All possible regressions for the X1-eliminated model #
k <- ols_step_all_possible(model_wf_rm1_log)
k
mindex n predictors rsquare adjr predrsq cp aic sbic sbc msep fpe apc hsp
1 1 X4 0.803   0.796   0.772  52.6  68.4 -20     72.6 0.538 0.536 0.225  0.0186 
2 1 X5 0.523   0.506   0.472  164    94.9 5.02  99.2 1.3   1.3   0.545  0.0451 
3 1 X2 0.433   0.413   0.346  200    100   10     104   1.55  1.54  0.648  0.0535 
4 1 X7 0.35    0.327   0.272  233    104   14     108   1.78  1.77  0.743  0.0614 
5 1 X3 0.227   0.199   0.133  283    109   19.1   114   2.11  2.1   0.884  0.073  
6 1 X8 0.0407  0.00648 -0.0793 357    116   25.4   120   2.62  2.61  1.1    0.0906 
7 1 X9 0.0176  -0.0175  -0.104  366    117   26.1   121   2.68  2.67  1.12   0.0928 
8 1 X6 0.00292 -0.0327  -0.125  372    117   26.5   121   2.72  2.71  1.14   0.0942 
9 2 X3 X4 0.873   0.864   0.844  26.6  57.2 -30.7   62.8 0.373 0.369 0.155  0.0129 
10 2 X3 X7 0.832   0.819   0.798  43.2  65.7 -23.4   71.3 0.495 0.49  0.206  0.0171 
11 2 X4 X7 0.817   0.804   0.774  48.9  68.2 -21.2   73.8 0.538 0.532 0.223  0.0186 
12 2 X4 X9 0.807   0.793   0.758  53.1  69.8 -19.7   75.4 0.568 0.562 0.236  0.0196 
13 2 X4 X8 0.806   0.792   0.759  53.4  69.9 -19.6   75.6 0.571 0.564 0.237  0.0197 
14 2 X4 X5 0.805   0.79    0.763  53.9  70.1 -19.5   75.7 0.574 0.568 0.239  0.0198 
15 2 X2 X4 0.804   0.79    0.764  54.1  70.2 -19.4   75.8 0.576 0.569 0.239  0.0199 
16 2 X4 X6 0.803   0.788   0.756  54.6  70.4 -19.2   76   0.58  0.573 0.241  0.02   
17 2 X2 X3 0.709   0.688   0.644  91.9  82.1 -8.61  87.7 0.855 0.845 0.355  0.0295 
18 2 X2 X5 0.646   0.62    0.592  117    88   -3.12  93.6 1.04  1.03  0.433  0.036  
19 2 X5 X7 0.6     0.57    0.536  136    91.6 0.323 97.3 1.18  1.16  0.489  0.0407 
20 2 X3 X5 0.564   0.531   0.489  150    94.3 2.8   99.9 1.28  1.27  0.533  0.0444 
21 2 X5 X8 0.553   0.52    0.479  154    95   3.49  101   1.32  1.3   0.547  0.0455 
22 2 X5 X6 0.544   0.511   0.451  158    95.6 4.02  101   1.34  1.32  0.557  0.0463 
23 2 X5 X9 0.526   0.491   0.439  165    96.7 5.14  102   1.39  1.38  0.579  0.0482 
24 2 X2 X8 0.451   0.41    0.335  195    101   9.35  107   1.62  1.6   0.671  0.0558 
25 2 X2 X7 0.44    0.398   0.318  200    102   9.95  107   1.65  1.63  0.685  0.057  
26 2 X2 X9 0.435   0.393   0.301  201    102   10.2   108   1.66  1.64  0.69   0.0574 
27 2 X2 X6 0.433   0.391   0.309  202    102   10.3   108   1.67  1.65  0.693  0.0576 
28 2 X7 X8 0.361   0.314   0.249  231    106   13.7   111   1.88  1.86  0.781  0.065  
29 2 X6 X7 0.35    0.302   0.224  235    106   14.2   112   1.91  1.89  0.794  0.0661 
30 2 X7 X9 0.35    0.302   0.224  235    106   14.2   112   1.91  1.89  0.794  0.0661 
31 2 X3 X8 0.282   0.229   0.133  262    109   17.1   115   2.11  2.09  0.877  0.073  
32 2 X3 X9 0.265   0.21    0.107  269    110   17.8   116   2.16  2.14  0.899  0.0748 
33 2 X3 X6 0.227   0.17    0.0769 284    111   19.2   117   2.27  2.25  0.945  0.0786 
34 2 X8 X9 0.051   -0.0193  -0.168  355    118   25.2   123   2.79  2.76  1.16   0.0965 
35 2 X6 X8 0.0423  -0.0286  -0.138  358    118   25.5   123   2.82  2.79  1.17   0.0974 
36 2 X6 X9 0.0207  -0.0519  -0.162  367    119   26.1   124   2.88  2.85  1.2    0.0996 
37 3 X3 X4 X7 0.89    0.878   0.854  21.7  54.8 -33     61.8 0.348 0.341 0.143  0.012  
38 3 X3 X4 X8 0.881   0.867   0.837  25.7  57.4 -31     64.4 0.38  0.372 0.156  0.0131 
39 3 X2 X3 X4 0.88    0.866   0.848  26.1  57.6 -30.8   64.7 0.383 0.375 0.157  0.0132 
40 3 X3 X4 X5 0.876   0.862   0.836  27.5  58.5 -30.1   65.5 0.394 0.386 0.162  0.0136 
41 3 X3 X4 X6 0.874   0.86    0.834  28.1  58.9 -29.8   65.9 0.399 0.391 0.164  0.0138 
42 3 X3 X4 X9 0.873   0.859   0.829  28.6  59.2 -29.5   66.2 0.403 0.394 0.166  0.0139 
43 3 X4 X8 X9 0.862   0.846   0.808  33.1  61.8 -27.4   68.8 0.439 0.43  0.181  0.0152 
44 3 X3 X6 X7 0.855   0.838   0.809  35.9  63.2 -26.2   70.2 0.461 0.452 0.19   0.0159 
45 3 X3 X7 X8 0.845   0.828   0.8    39.7  65.1 -24.6   72.1 0.491 0.481 0.202  0.017  
46 3 X2 X3 X7 0.835   0.816   0.783  43.8  67.1 -23     74.1 0.524 0.513 0.216  0.0181 
47 3 X3 X5 X7 0.835   0.815   0.789  44    67.2 -22.9   74.2 0.526 0.515 0.216  0.0182 
48 3 X3 X7 X9 0.833   0.813   0.788  44.7  67.5 -22.6   74.5 0.532 0.521 0.219  0.0184 
49 3 X4 X7 X8 0.821   0.801   0.76   49.4  69.5 -20.9   76.5 0.568 0.557 0.234  0.0196 
50 3 X4 X7 X9 0.82    0.799   0.759  50    69.8 -20.7   76.8 0.574 0.562 0.236  0.0198 
51 3 X2 X4 X7 0.819   0.799   0.765  50.1  69.8 -20.7   76.8 0.574 0.562 0.236  0.0198 
52 3 X4 X6 X7 0.817   0.796   0.759  50.8  70.1 -20.4   77.1 0.58  0.568 0.239  0.0201 
53 3 X4 X5 X7 0.817   0.796   0.759  50.8  70.1 -20.4   77.1 0.58  0.568 0.239  0.0201 
54 3 X4 X5 X8 0.809   0.787   0.752  54.4  71.5 -19.2   78.6 0.608 0.596 0.25   0.021  
55 3 X2 X4 X9 0.808   0.786   0.748  54.5  71.6 -19.2   78.6 0.609 0.596 0.251  0.021  
56 3 X4 X5 X9 0.808   0.786   0.749  54.5  71.6 -19.1   78.6 0.61  0.597 0.251  0.0211 
57 3 X2 X4 X8 0.807   0.785   0.75   54.9  71.8 -19     78.8 0.613 0.6   0.252  0.0212 
58 3 X4 X6 X9 0.807   0.785   0.741  55.1  71.8 -19     78.8 0.614 0.601 0.253  0.0212 
59 3 X4 X6 X8 0.806   0.784   0.742  55.4  71.9 -18.8   79   0.616 0.604 0.254  0.0213 
60 3 X2 X4 X5 0.806   0.783   0.756  55.6  72   -18.8   79   0.618 0.605 0.254  0.0214 
61 3 X4 X5 X6 0.805   0.783   0.747  55.8  72.1 -18.7   79.1 0.62  0.607 0.255  0.0214 
62 3 X2 X4 X6 0.804   0.782   0.749  56.1  72.2 -18.6   79.2 0.622 0.609 0.256  0.0215 
63 3 X2 X3 X5 0.761   0.734   0.7    73.2  78.2 -13.4   85.2 0.758 0.743 0.312  0.0262 
64 3 X2 X3 X8 0.737   0.707   0.649  82.8  81.1 -10.8   88.1 0.835 0.818 0.344  0.0289 
65 3 X2 X3 X9 0.721   0.689   0.62   89.2  82.8 -9.23  89.8 0.885 0.867 0.364  0.0306 
66 3 X2 X3 X6 0.714   0.681   0.623  92    83.6 -8.56  90.6 0.908 0.889 0.374  0.0314 
67 3 X2 X5 X8 0.665   0.627   0.594  111    88.3 -4.24  95.3 1.06  1.04  0.437  0.0367 
68 3 X2 X5 X6 0.66    0.621   0.575  113    88.7 -3.84  95.7 1.08  1.06  0.444  0.0373 
69 3 X2 X5 X9 0.647   0.606   0.557  119    89.9 -2.74  96.9 1.12  1.1   0.462  0.0388 
70 3 X2 X5 X7 0.646   0.605   0.551  119    90   -2.69  97   1.12  1.1   0.463  0.0389 
71 3 X5 X6 X7 0.62    0.577   0.516  129    92.1 -0.755 99.1 1.21  1.18  0.496  0.0417 
72 3 X5 X7 X8 0.616   0.572   0.531  131    92.4 -0.457 99.4 1.22  1.19  0.502  0.0421 
73 3 X3 X5 X8 0.601   0.555   0.503  137    93.6 0.649 101   1.27  1.24  0.522  0.0438 
74 3 X5 X7 X9 0.6     0.554   0.502  138    93.6 0.698 101   1.27  1.24  0.523  0.0439 
75 3 X5 X8 X9 0.598   0.551   0.488  139    93.8 0.876 101   1.28  1.25  0.526  0.0442 
76 3 X3 X5 X6 0.588   0.54    0.466  142    94.6 1.54  102   1.31  1.28  0.539  0.0453 
77 3 X5 X6 X8 0.578   0.529   0.465  146    95.3 2.22  102   1.34  1.31  0.552  0.0464 
78 3 X3 X5 X9 0.573   0.524   0.461  148    95.6 2.54  103   1.36  1.33  0.558  0.0469 
79 3 X5 X6 X9 0.547   0.495   0.415  159    97.4 4.19  104   1.44  1.41  0.592  0.0498 
80 3 X2 X8 X9 0.476   0.415   0.295  187    102   8.31  109   1.67  1.63  0.686  0.0576 
81 3 X2 X7 X8 0.455   0.392   0.3    196    103   9.44  110   1.73  1.7   0.713  0.0599 
82 3 X2 X6 X8 0.451   0.388   0.295  197    103   9.62  110   1.74  1.71  0.718  0.0603 
83 3 X2 X7 X9 0.44    0.376   0.271  201    104   10.2   111   1.78  1.74  0.732  0.0615 
84 3 X2 X6 X7 0.44    0.375   0.27   202    104   10.2   111   1.78  1.74  0.733  0.0616 
85 3 X2 X6 X9 0.435   0.37    0.259  203    104   10.4   111   1.79  1.76  0.738  0.062  
86 3 X7 X8 X9 0.408   0.34    0.216  214    105   11.8   112   1.88  1.84  0.774  0.065  
87 3 X6 X7 X8 0.361   0.288   0.199  233    108   13.9   115   2.03  1.99  0.835  0.0701 
88 3 X6 X7 X9 0.35    0.276   0.168  237    108   14.4   115   2.06  2.02  0.849  0.0713 
89 3 X3 X6 X8 0.283   0.201   0.0775 264    111   17.2   118   2.28  2.23  0.937  0.0787 
90 3 X3 X8 X9 0.283   0.2     0.0776 264    111   17.3   118   2.28  2.23  0.937  0.0787 
91 3 X3 X6 X9 0.265   0.18    0.047  271    112   18     119   2.34  2.29  0.961  0.0807 
92 3 X6 X8 X9 0.0515  -0.0579  -0.242  356    120   25.4   127   3.01  2.95  1.24   0.104  
93 4 X3 X4 X8 X9 0.916   0.902   0.879  13.7  49   -37.3   57.4 0.291 0.281 0.118  0.01   
94 4 X3 X4 X7 X8 0.898   0.882   0.85   20.5  54.5 -33.3   62.9 0.35  0.338 0.142  0.0121 
95 4 X3 X4 X6 X7 0.897   0.881   0.854  20.9  54.8 -33.1   63.2 0.353 0.342 0.144  0.0122 
96 4 X3 X4 X5 X7 0.894   0.877   0.848  22.3  55.9 -32.4   64.3 0.365 0.354 0.149  0.0126 
97 4 X2 X3 X4 X7 0.891   0.873   0.847  23.6  56.8 -31.7   65.2 0.377 0.364 0.153  0.013  
98 4 X3 X4 X7 X9 0.89    0.873   0.842  23.7  56.8 -31.7   65.2 0.377 0.365 0.153  0.013  
99 4 X2 X3 X4 X8 0.888   0.87    0.843  24.6  57.4 -31.2   65.8 0.385 0.372 0.156  0.0133 
100 4 X3 X4 X5 X8 0.882   0.864   0.828  26.9  58.9 -30.1   67.3 0.405 0.392 0.165  0.014  
101 4 X3 X4 X6 X8 0.882   0.863   0.826  27    59   -30     67.4 0.405 0.392 0.165  0.014  
102 4 X2 X3 X4 X6 0.881   0.862   0.838  27.3  59.2 -29.9   67.6 0.408 0.395 0.166  0.0141 
103 4 X2 X3 X4 X5 0.881   0.862   0.838  27.3  59.2 -29.9   67.6 0.408 0.395 0.166  0.0141 
104 4 X2 X3 X4 X9 0.88    0.86    0.831  28.1  59.6 -29.5   68.1 0.415 0.401 0.169  0.0143 
105 4 X3 X4 X5 X6 0.876   0.856   0.824  29.4  60.5 -28.9   68.9 0.426 0.412 0.173  0.0147 
106 4 X3 X4 X5 X9 0.876   0.856   0.821  29.4  60.5 -28.9   68.9 0.426 0.413 0.173  0.0147 
107 4 X3 X4 X6 X9 0.874   0.854   0.819  30.1  60.9 -28.6   69.3 0.432 0.418 0.176  0.0149 
108 4 X3 X6 X7 X8 0.871   0.85    0.814  31.6  61.8 -27.9   70.2 0.445 0.431 0.181  0.0154 
109 4 X4 X7 X8 X9 0.871   0.85    0.811  31.6  61.8 -27.9   70.2 0.445 0.431 0.181  0.0154 
110 4 X3 X7 X8 X9 0.869   0.848   0.817  32.2  62.1 -27.6   70.5 0.45  0.436 0.183  0.0156 
111 4 X4 X5 X8 X9 0.866   0.844   0.801  33.5  62.9 -27     71.3 0.462 0.447 0.188  0.016  
112 4 X2 X4 X8 X9 0.864   0.842   0.8    34.3  63.3 -26.7   71.7 0.469 0.454 0.191  0.0162 
113 4 X4 X6 X8 X9 0.864   0.842   0.796  34.4  63.4 -26.7   71.8 0.469 0.454 0.191  0.0162 
114 4 X3 X5 X6 X7 0.861   0.839   0.802  35.4  63.9 -26.2   72.3 0.478 0.463 0.194  0.0165 
115 4 X2 X3 X6 X7 0.856   0.833   0.8    37.3  64.9 -25.4   73.3 0.494 0.479 0.201  0.0171 
116 4 X3 X6 X7 X9 0.856   0.833   0.797  37.6  65.1 -25.3   73.5 0.497 0.481 0.202  0.0172 
117 4 X2 X3 X7 X8 0.85    0.826   0.784  40    66.3 -24.3   74.7 0.517 0.501 0.21   0.0179 
118 4 X3 X5 X7 X8 0.849   0.825   0.793  40.3  66.5 -24.2   74.9 0.52  0.504 0.212  0.018  
119 4 X2 X3 X5 X7 0.839   0.813   0.773  44.4  68.4 -22.6   76.9 0.556 0.538 0.226  0.0192 
120 4 X2 X3 X7 X9 0.837   0.81    0.771  45.2  68.8 -22.3   77.2 0.563 0.544 0.229  0.0194 
121 4 X3 X5 X7 X9 0.835   0.809   0.779  45.7  69   -22.1   77.4 0.567 0.548 0.23   0.0196 
122 4 X2 X4 X7 X8 0.824   0.796   0.748  50.2  71   -20.5   79.4 0.606 0.587 0.246  0.0209 
123 4 X4 X5 X7 X8 0.822   0.793   0.745  51.1  71.4 -20.2   79.8 0.614 0.594 0.25   0.0212 
124 4 X4 X6 X7 X8 0.821   0.793   0.742  51.3  71.5 -20.1   79.9 0.615 0.595 0.25   0.0213 
125 4 X2 X4 X7 X9 0.821   0.792   0.742  51.4  71.5 -20.1   79.9 0.616 0.596 0.25   0.0213 
126 4 X2 X4 X5 X7 0.82    0.791   0.747  51.9  71.7 -19.9   80.1 0.62  0.6   0.252  0.0214 
127 4 X4 X6 X7 X9 0.82    0.791   0.742  51.9  71.8 -19.9   80.2 0.621 0.601 0.252  0.0215 
128 4 X2 X4 X6 X7 0.82    0.791   0.747  52    71.8 -19.9   80.2 0.621 0.601 0.252  0.0215 
129 4 X4 X5 X7 X9 0.82    0.791   0.743  52    71.8 -19.9   80.2 0.621 0.601 0.253  0.0215 
130 4 X4 X5 X6 X7 0.818   0.788   0.741  52.8  72.1 -19.6   80.5 0.628 0.608 0.255  0.0217 
131 4 X2 X4 X5 X9 0.809   0.779   0.738  56.2  73.5 -18.5   81.9 0.657 0.636 0.267  0.0227 
132 4 X4 X5 X6 X8 0.809   0.779   0.734  56.2  73.5 -18.5   81.9 0.657 0.636 0.267  0.0227 
133 4 X2 X4 X5 X8 0.809   0.779   0.743  56.2  73.5 -18.5   81.9 0.657 0.636 0.267  0.0227 
134 4 X4 X5 X6 X9 0.808   0.778   0.73   56.5  73.6 -18.4   82   0.66  0.638 0.268  0.0228 
135 4 X2 X4 X6 X9 0.808   0.778   0.731  56.5  73.6 -18.4   82   0.66  0.638 0.268  0.0228 
136 4 X2 X4 X6 X8 0.807   0.776   0.733  56.9  73.8 -18.2   82.2 0.664 0.642 0.27   0.0229 
137 4 X2 X4 X5 X6 0.806   0.775   0.738  57.6  74   -18     82.4 0.669 0.648 0.272  0.0231 
138 4 X2 X3 X5 X8 0.788   0.754   0.711  64.8  76.7 -15.8   85.1 0.731 0.708 0.297  0.0253 
139 4 X2 X3 X5 X6 0.777   0.742   0.689  68.9  78.1 -14.5   86.5 0.767 0.742 0.312  0.0265 
140 4 X2 X3 X5 X9 0.768   0.731   0.675  72.6  79.3 -13.5   87.7 0.799 0.773 0.325  0.0276 
141 4 X2 X3 X8 X9 0.744   0.703   0.639  82.1  82.2 -10.9   90.7 0.881 0.852 0.358  0.0304 
142 4 X2 X3 X6 X8 0.744   0.703   0.628  82.1  82.3 -10.9   90.7 0.881 0.853 0.358  0.0305 
143 4 X2 X3 X6 X9 0.726   0.682   0.594  89.2  84.3 -9.16  92.7 0.943 0.912 0.383  0.0326 
144 4 X2 X5 X8 X9 0.712   0.666   0.613  94.7  85.8 -7.86  94.2 0.99  0.958 0.403  0.0342 
145 4 X2 X5 X6 X8 0.682   0.632   0.58   107    88.7 -5.19  97.1 1.09  1.06  0.445  0.0378 
146 4 X5 X7 X8 X9 0.679   0.628   0.561  108    89.1 -4.91  97.5 1.11  1.07  0.449  0.0382 
147 4 X2 X5 X7 X8 0.666   0.612   0.549  113    90.3 -3.81  98.7 1.15  1.11  0.468  0.0398 
148 4 X2 X5 X6 X9 0.661   0.607   0.536  115    90.7 -3.43  99.1 1.17  1.13  0.475  0.0404 
149 4 X2 X5 X6 X7 0.661   0.606   0.527  115    90.7 -3.41  99.1 1.17  1.13  0.475  0.0404 
150 4 X2 X5 X7 X9 0.647   0.59    0.515  121    91.9 -2.31  100   1.22  1.18  0.495  0.042  
151 4 X5 X6 X8 X9 0.641   0.583   0.484  123    92.4 -1.87  101   1.24  1.2   0.503  0.0427 
152 4 X5 X6 X7 X8 0.639   0.582   0.515  124    92.6 -1.75  101   1.24  1.2   0.505  0.0429 
153 4 X3 X5 X6 X8 0.629   0.57    0.49   128    93.4 -1.01  102   1.28  1.23  0.519  0.0441 
154 4 X3 X5 X8 X9 0.628   0.569   0.513  128    93.5 -0.934 102   1.28  1.24  0.52   0.0442 
155 4 X5 X6 X7 X9 0.621   0.56    0.474  131    94.1 -0.365 102   1.31  1.26  0.531  0.0452 
156 4 X3 X5 X6 X9 0.596   0.531   0.439  141    95.9 1.35  104   1.39  1.35  0.565  0.0481 
157 4 X2 X7 X8 X9 0.485   0.403   0.269  185    103   8.08  112   1.77  1.71  0.72   0.0612 
158 4 X2 X6 X8 X9 0.476   0.393   0.251  189    104   8.57  112   1.8   1.74  0.733  0.0623 
159 4 X2 X6 X7 X8 0.455   0.367   0.25   198    105   9.71  113   1.88  1.82  0.763  0.0649 
160 4 X2 X6 X7 X9 0.44    0.351   0.216  203    106   10.4   114   1.93  1.86  0.784  0.0666 
161 4 X6 X7 X8 X9 0.412   0.318   0.158  215    107   11.8   116   2.02  1.96  0.823  0.07   
162 4 X3 X6 X8 X9 0.285   0.17    0.0102 265    113   17.4   121   2.46  2.38  1      0.0851 
163 5 X3 X4 X7 X8 X9 0.932   0.918   0.892  9.14 44.5 -39.4   54.3 0.255 0.243 0.102  0.0088 
164 5 X3 X4 X6 X8 X9 0.921   0.904   0.877  13.7  49.2 -36.6   59   0.297 0.284 0.119  0.0103 
165 5 X2 X3 X4 X8 X9 0.919   0.903   0.879  14.1  49.6 -36.4   59.4 0.301 0.287 0.121  0.0104 
166 5 X3 X4 X5 X8 X9 0.916   0.898   0.868  15.5  50.8 -35.6   60.7 0.314 0.3   0.126  0.0109 
167 5 X3 X4 X6 X7 X8 0.907   0.888   0.851  19    53.8 -33.7   63.6 0.347 0.331 0.139  0.012  
168 5 X3 X6 X7 X8 X9 0.905   0.885   0.856  19.9  54.5 -33.2   64.3 0.355 0.339 0.142  0.0123 
169 5 X3 X4 X5 X7 X8 0.901   0.88    0.842  21.5  55.8 -32.4   65.6 0.371 0.354 0.149  0.0128 
170 5 X2 X3 X4 X7 X8 0.899   0.878   0.842  22.3  56.4 -32     66.2 0.378 0.36  0.151  0.0131 
171 5 X3 X4 X5 X6 X7 0.899   0.877   0.843  22.5  56.5 -31.9   66.3 0.38  0.362 0.152  0.0131 
172 5 X2 X3 X4 X6 X7 0.898   0.876   0.848  22.9  56.8 -31.7   66.6 0.383 0.366 0.154  0.0133 
173 5 X3 X4 X6 X7 X9 0.897   0.876   0.842  22.9  56.8 -31.7   66.6 0.384 0.366 0.154  0.0133 
174 5 X3 X4 X5 X7 X9 0.894   0.872   0.835  24.3  57.8 -31     67.6 0.397 0.379 0.159  0.0137 
175 5 X2 X3 X4 X5 X7 0.894   0.872   0.841  24.3  57.8 -31     67.7 0.397 0.379 0.159  0.0137 
176 5 X2 X3 X4 X6 X8 0.891   0.868   0.832  25.5  58.7 -30.4   68.5 0.408 0.389 0.164  0.0141 
177 5 X2 X3 X4 X7 X9 0.891   0.868   0.833  25.6  58.8 -30.3   68.6 0.409 0.39  0.164  0.0141 
178 5 X2 X3 X4 X5 X8 0.889   0.866   0.832  26.2  59.1 -30.1   68.9 0.414 0.395 0.166  0.0143 
179 5 X3 X4 X5 X6 X8 0.883   0.859   0.815  28.6  60.7 -28.9   70.5 0.437 0.417 0.175  0.0151 
180 5 X2 X3 X4 X5 X6 0.882   0.858   0.827  29    61   -28.8   70.8 0.441 0.42  0.177  0.0152 
181 5 X2 X3 X4 X6 X9 0.881   0.857   0.819  29.3  61.2 -28.6   71   0.444 0.423 0.178  0.0153 
182 5 X2 X3 X4 X5 X9 0.881   0.857   0.82   29.3  61.2 -28.6   71   0.444 0.423 0.178  0.0153 
183 5 X3 X5 X6 X7 X8 0.878   0.853   0.811  30.6  62   -28     71.8 0.456 0.435 0.183  0.0158 
184 5 X3 X5 X7 X8 X9 0.877   0.851   0.82   31.2  62.4 -27.8   72.2 0.462 0.44  0.185  0.016  
185 5 X3 X4 X5 X6 X9 0.876   0.851   0.807  31.3  62.4 -27.7   72.2 0.462 0.441 0.185  0.016  
186 5 X2 X3 X6 X7 X8 0.873   0.846   0.804  32.8  63.3 -27.1   73.1 0.476 0.454 0.191  0.0165 
187 5 X4 X5 X7 X8 X9 0.872   0.846   0.799  32.9  63.4 -27     73.2 0.477 0.455 0.191  0.0165 
188 5 X4 X6 X7 X8 X9 0.872   0.845   0.797  33.2  63.6 -26.9   73.4 0.48  0.458 0.193  0.0166 
189 5 X2 X3 X7 X8 X9 0.872   0.845   0.798  33.3  63.6 -26.9   73.4 0.481 0.459 0.193  0.0166 
190 5 X4 X5 X6 X8 X9 0.871   0.844   0.797  33.4  63.6 -26.8   73.5 0.482 0.46  0.193  0.0166 
191 5 X2 X4 X7 X8 X9 0.871   0.844   0.795  33.5  63.7 -26.8   73.5 0.483 0.461 0.194  0.0167 
192 5 X2 X4 X5 X8 X9 0.867   0.839   0.792  35.2  64.7 -26.1   74.5 0.499 0.476 0.2    0.0172 
193 5 X2 X4 X6 X8 X9 0.866   0.838   0.788  35.6  64.9 -25.9   74.7 0.503 0.48  0.202  0.0174 
194 5 X2 X3 X5 X6 X7 0.863   0.834   0.794  36.7  65.5 -25.4   75.3 0.513 0.489 0.206  0.0177 
195 5 X3 X5 X6 X7 X9 0.862   0.833   0.79   37.2  65.8 -25.2   75.6 0.518 0.494 0.207  0.0179 
196 5 X2 X3 X6 X7 X9 0.857   0.828   0.787  38.9  66.7 -24.5   76.5 0.534 0.509 0.214  0.0184 
197 5 X2 X3 X5 X7 X8 0.854   0.823   0.776  40.3  67.4 -24     77.3 0.547 0.522 0.219  0.0189 
198 5 X2 X3 X5 X7 X9 0.84    0.807   0.76   45.9  70.2 -21.9   80   0.599 0.572 0.24   0.0207 
199 5 X2 X4 X5 X7 X8 0.825   0.788   0.73   51.8  72.9 -19.8   82.7 0.655 0.625 0.263  0.0226 
200 5 X2 X4 X6 X7 X8 0.824   0.787   0.725  52.2  73   -19.7   82.8 0.658 0.628 0.264  0.0227 
201 5 X4 X5 X6 X7 X8 0.822   0.785   0.724  53.1  73.4 -19.3   83.2 0.667 0.636 0.267  0.0231 
202 5 X2 X4 X5 X7 X9 0.821   0.784   0.723  53.3  73.5 -19.3   83.3 0.668 0.638 0.268  0.0231 
203 5 X2 X4 X6 X7 X9 0.821   0.784   0.722  53.3  73.5 -19.3   83.3 0.669 0.638 0.268  0.0231 
204 5 X2 X4 X5 X6 X7 0.82    0.782   0.728  53.8  73.7 -19.1   83.5 0.674 0.643 0.27   0.0233 
205 5 X4 X5 X6 X7 X9 0.82    0.782   0.724  53.9  73.8 -19.1   83.6 0.675 0.644 0.27   0.0233 
206 5 X2 X4 X5 X6 X8 0.809   0.77    0.723  58    75.4 -17.7   85.2 0.713 0.68  0.286  0.0246 
207 5 X2 X4 X5 X6 X9 0.809   0.769   0.718  58.1  75.4 -17.7   85.3 0.714 0.681 0.286  0.0247 
208 5 X2 X3 X5 X8 X9 0.807   0.766   0.726  59.1  75.8 -17.4   85.7 0.723 0.69  0.29   0.025  
209 5 X2 X3 X5 X6 X8 0.807   0.766   0.706  59.2  75.9 -17.4   85.7 0.724 0.691 0.29   0.025  
210 5 X2 X3 X5 X6 X9 0.783   0.738   0.664  68.5  79.3 -14.5   89.1 0.811 0.774 0.325  0.028  
211 5 X2 X3 X6 X8 X9 0.753   0.702   0.622  80.4  83.1 -11.3   92.9 0.923 0.88  0.37   0.0319 
212 5 X2 X5 X6 X8 X9 0.745   0.692   0.612  83.7  84.1 -10.4   94   0.954 0.91  0.382  0.033  
213 5 X5 X6 X7 X8 X9 0.723   0.665   0.559  92.6  86.7 -8.29  96.5 1.04  0.989 0.416  0.0358 
214 5 X2 X5 X7 X8 X9 0.713   0.654   0.57   96.3  87.7 -7.42  97.5 1.07  1.02  0.43   0.0371 
215 5 X2 X5 X6 X7 X8 0.682   0.616   0.527  109    90.7 -4.73  101   1.19  1.13  0.476  0.0411 
216 5 X3 X5 X6 X8 X9 0.672   0.603   0.512  113    91.7 -3.84  102   1.23  1.17  0.493  0.0425 
217 5 X2 X5 X6 X7 X9 0.661   0.59    0.483  117    92.7 -2.99  103   1.27  1.21  0.509  0.0438 
218 5 X2 X6 X7 X8 X9 0.487   0.38    0.214  187    105   8.29  115   1.92  1.83  0.769  0.0663 
219 6 X3 X4 X6 X7 X8 X9 0.947   0.933   0.908  5.24 39.2 -40.5   50.4 0.217 0.204 0.0857 0.00751
220 6 X3 X4 X5 X7 X8 X9 0.933   0.915   0.883  10.9  46.2 -37.3   57.4 0.275 0.258 0.108  0.0095 
221 6 X2 X3 X4 X7 X8 X9 0.932   0.914   0.883  11.1  46.5 -37.2   57.7 0.278 0.26  0.109  0.00959
222 6 X2 X3 X4 X6 X8 X9 0.925   0.906   0.877  13.8  49.3 -35.7   60.5 0.305 0.286 0.12   0.0105 
223 6 X3 X5 X6 X7 X8 X9 0.921   0.9     0.873  15.5  51   -34.8   62.2 0.323 0.303 0.127  0.0112 
224 6 X3 X4 X5 X6 X8 X9 0.921   0.9     0.869  15.7  51.2 -34.7   62.4 0.324 0.304 0.128  0.0112 
225 6 X2 X3 X4 X5 X8 X9 0.92    0.899   0.866  16.1  51.5 -34.5   62.7 0.328 0.308 0.129  0.0113 
226 6 X3 X4 X5 X6 X7 X8 0.907   0.883   0.838  20.9  55.7 -32.1   66.9 0.378 0.354 0.149  0.0131 
227 6 X2 X3 X4 X6 X7 X8 0.907   0.883   0.843  20.9  55.8 -32.1   67   0.378 0.354 0.149  0.0131 
228 6 X2 X3 X6 X7 X8 X9 0.905   0.881   0.846  21.7  56.4 -31.7   67.6 0.386 0.362 0.152  0.0133 
229 6 X2 X3 X4 X5 X7 X8 0.901   0.875   0.834  23.4  57.7 -30.9   68.9 0.404 0.378 0.159  0.0139 
230 6 X3 X4 X5 X6 X7 X9 0.899   0.872   0.829  24.5  58.5 -30.4   69.7 0.414 0.388 0.163  0.0143 
231 6 X2 X3 X4 X5 X6 X7 0.899   0.872   0.837  24.5  58.5 -30.4   69.7 0.414 0.388 0.163  0.0143 
232 6 X2 X3 X4 X6 X7 X9 0.898   0.871   0.833  24.9  58.8 -30.2   70   0.418 0.392 0.165  0.0145 
233 6 X2 X3 X4 X5 X7 X9 0.894   0.866   0.826  26.3  59.8 -29.5   71   0.433 0.406 0.171  0.015  
234 6 X2 X3 X4 X5 X6 X8 0.891   0.863   0.82   27.5  60.6 -29     71.8 0.445 0.417 0.175  0.0154 
235 6 X2 X3 X4 X5 X6 X9 0.882   0.852   0.806  31    63   -27.5   74.2 0.481 0.451 0.189  0.0166 
236 6 X2 X3 X5 X6 X7 X8 0.881   0.85    0.803  31.6  63.4 -27.2   74.6 0.487 0.457 0.192  0.0168 
237 6 X2 X3 X5 X7 X8 X9 0.88    0.848   0.797  32.1  63.7 -27     74.9 0.492 0.461 0.194  0.017  
238 6 X4 X5 X6 X7 X8 X9 0.875   0.843   0.787  33.7  64.7 -26.3   75.9 0.509 0.477 0.2    0.0176 
239 6 X2 X4 X5 X7 X8 X9 0.873   0.84    0.78   34.7  65.2 -25.9   76.4 0.518 0.486 0.204  0.0179 
240 6 X2 X4 X6 X7 X8 X9 0.872   0.838   0.778  35.1  65.5 -25.8   76.7 0.523 0.491 0.206  0.0181 
241 6 X2 X4 X5 X6 X8 X9 0.872   0.838   0.786  35.2  65.6 -25.7   76.8 0.524 0.492 0.207  0.0181 
242 6 X2 X3 X5 X6 X7 X9 0.864   0.828   0.781  38.3  67.3 -24.5   78.5 0.556 0.521 0.219  0.0192 
243 6 X2 X3 X5 X6 X8 X9 0.836   0.793   0.739  49.4  72.9 -20.4   84.1 0.669 0.628 0.264  0.0231 
244 6 X2 X4 X5 X6 X7 X8 0.825   0.779   0.707  53.8  74.9 -18.9   86.1 0.715 0.67  0.282  0.0247 
245 6 X2 X4 X5 X6 X7 X9 0.821   0.775   0.702  55.2  75.5 -18.5   86.7 0.729 0.684 0.287  0.0252 
246 6 X2 X5 X6 X7 X8 X9 0.748   0.682   0.563  84.7  85.8 -10.1   97   1.03  0.966 0.406  0.0356 
247 7 X3 X4 X5 X6 X7 X8 X9 0.947   0.93    0.902  7.06 40.9 -37.8   53.5 0.236 0.217 0.0912 0.00816
248 7 X2 X3 X4 X6 X7 X8 X9 0.947   0.93    0.902  7.15 41   -37.8   53.6 0.237 0.218 0.0916 0.00819
249 7 X2 X3 X4 X5 X7 X8 X9 0.933   0.911   0.874  12.8  48.2 -35.1   60.8 0.301 0.276 0.116  0.0104 
250 7 X2 X3 X4 X5 X6 X8 X9 0.926   0.902   0.868  15.7  51.2 -33.8   63.8 0.333 0.306 0.128  0.0115 
251 7 X2 X3 X5 X6 X7 X8 X9 0.921   0.897   0.864  17.3  52.8 -33     65.4 0.351 0.323 0.136  0.0121 
252 7 X2 X3 X4 X5 X6 X7 X8 0.908   0.878   0.83   22.8  57.7 -30.5   70.3 0.413 0.38  0.159  0.0143 
253 7 X2 X3 X4 X5 X6 X7 X9 0.899   0.866   0.821  26.5  60.5 -28.9   73.1 0.454 0.417 0.175  0.0157 
254 7 X2 X4 X5 X6 X7 X8 X9 0.876   0.837   0.768  35.5  66.5 -25.2   79.1 0.554 0.51  0.214  0.0192 
255 8 X2 X3 X4 X5 X6 X7 X8 X9 0.947   0.927   0.896  9    42.8 -35     56.8 0.259 0.233 0.0977 0.00895
plot(k)

# Stepwise regression based on p-values for the full model
k <- ols_step_both_p(model_wf_full_log)
## Stepwise Selection Method   
## ---------------------------
## 
## Candidate Terms: 
## 
## 1. X1 
## 2. X2 
## 3. X3 
## 4. X4 
## 5. X5 
## 6. X6 
## 7. X7 
## 8. X8 
## 9. X9 
## 
## We are selecting variables based on p value...
## 
## Variables Entered/Removed: 
## 
## - X4 added 
## - X3 added 
## - X7 added 
## 
## No more variables to be added/removed.
## 
## 
## Final Model Output 
## ------------------
## 
##                         Model Summary                         
## -------------------------------------------------------------
## R                       0.944       RMSE               0.549 
## R-Squared               0.890       Coef. Var          8.618 
## Adj. R-Squared          0.878       MSE                0.301 
## Pred R-Squared          0.854       MAE                0.414 
## -------------------------------------------------------------
##  RMSE: Root Mean Square Error 
##  MSE: Mean Square Error 
##  MAE: Mean Absolute Error 
## 
##                                ANOVA                                
## -------------------------------------------------------------------
##                Sum of                                              
##               Squares        DF    Mean Square      F         Sig. 
## -------------------------------------------------------------------
## Regression     63.565         3         21.188    70.378    0.0000 
## Residual        7.828        26          0.301                     
## Total          71.393        29                                    
## -------------------------------------------------------------------
## 
##                                  Parameter Estimates                                  
## -------------------------------------------------------------------------------------
##       model     Beta    Std. Error    Std. Beta      t       Sig      lower    upper 
## -------------------------------------------------------------------------------------
## (Intercept)    2.872         0.547                 5.254    0.000     1.748    3.995 
##          X4    0.122         0.033        0.559    3.730    0.001     0.055    0.189 
##          X3    0.168         0.040        0.435    4.165    0.000     0.085    0.251 
##          X7    3.106         1.537        0.309    2.021    0.054    -0.053    6.266 
## -------------------------------------------------------------------------------------
k
## 
##                              Stepwise Selection Summary                              
## ------------------------------------------------------------------------------------
##                      Added/                   Adj.                                      
## Step    Variable    Removed     R-Square    R-Square     C(p)        AIC       RMSE     
## ------------------------------------------------------------------------------------
##    1       X4       addition       0.803       0.796    48.8550    68.4060    0.7087    
##    2       X3       addition       0.873       0.864    24.2130    57.2082    0.5792    
##    3       X7       addition       0.890       0.878    19.6670    54.8305    0.5487    
## ------------------------------------------------------------------------------------
plot(k)
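As a sanity check on the C(p) column in the summary above, Mallows' Cp can be recomputed as Cp = RSS/σ̂² − (n − 2p), where σ̂² is the full-model MSE and p counts parameters including the intercept. The full-model MSE is not printed in this excerpt, so the sketch below (values transcribed from the table, n = 30) backs σ̂² out of step 1 and checks it against the remaining steps; RSS is recovered from the reported RMSE via RSS = RMSE²·(n − k − 1).

```python
# Mallows' Cp check for the stepwise (p-value) summary above.
# Each tuple: (step RMSE, number of predictors k, reported Cp); n = 30.
n = 30
steps = [(0.7087, 1, 48.855), (0.5792, 2, 24.213), (0.5487, 3, 19.667)]

# Recover RSS from RMSE: RSS = RMSE^2 * (n - k - 1)
rss = [r**2 * (n - k - 1) for r, k, _ in steps]

# Back out the full-model MSE sigma^2 from step 1:
# Cp = RSS / sigma^2 - (n - 2p), with p = k + 1 parameters
r0, k0, cp0 = steps[0]
sigma2 = rss[0] / (cp0 + n - 2 * (k0 + 1))

# The remaining steps should reproduce the reported Cp values
for (r, k, cp), s in zip(steps, rss):
    cp_hat = s / sigma2 - (n - 2 * (k + 1))
    print(round(cp_hat, 3), cp)
```

The recomputed values agree with the printed C(p) column to within rounding of the transcribed RMSE values.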

# Stepwise AIC regression for the full model
k <- ols_step_both_aic(model_wf_full_log)
## Stepwise Selection Method 
## -------------------------
## 
## Candidate Terms: 
## 
## 1 . X1 
## 2 . X2 
## 3 . X3 
## 4 . X4 
## 5 . X5 
## 6 . X6 
## 7 . X7 
## 8 . X8 
## 9 . X9 
## 
## 
## Variables Entered/Removed: 
## 
## - X4 added 
## - X3 added 
## - X7 added 
## - X8 added 
## - X9 added 
## - X6 added 
## 
## No more variables to be added or removed.
k
## 
## 
##                               Stepwise Summary                              
## --------------------------------------------------------------------------
## Variable     Method      AIC       RSS      Sum Sq     R-Sq      Adj. R-Sq 
## --------------------------------------------------------------------------
## X4          addition    68.406    14.063    57.330    0.80302      0.79599 
## X3          addition    57.208     9.057    62.335    0.87313      0.86373 
## X7          addition    54.830     7.828    63.565    0.89036      0.87771 
## X8          addition    54.522     7.248    64.144    0.89848      0.88223 
## X9          addition    44.504     4.856    66.537    0.93199      0.91782 
## X6          addition    39.161     3.801    67.591    0.94675      0.93286 
## --------------------------------------------------------------------------
plot(k)
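The AIC column above can be reproduced from the RSS column: olsrr reports the same value as R's `AIC()`, which for a Gaussian `lm` fit is n·ln(2π·RSS/n) + n + 2(k + 2), where k is the number of slope terms and the +2 accounts for the intercept and σ². A quick check (values transcribed from the table, so small rounding differences are expected):

```python
import math

# AIC check against the stepwise AIC summary above (n = 30 observations).
# R's AIC for a Gaussian lm: n*ln(2*pi*RSS/n) + n + 2*(k + 2),
# where k is the number of slopes (+2 for the intercept and sigma^2).
n = 30
rows = [  # (number of predictors k, RSS, reported AIC)
    (1, 14.063, 68.406),
    (2, 9.057, 57.208),
    (3, 7.828, 54.830),
    (4, 7.248, 54.522),
    (5, 4.856, 44.504),
    (6, 3.801, 39.161),
]

def gaussian_aic(rss, n, k):
    return n * math.log(2 * math.pi * rss / n) + n + 2 * (k + 2)

for k, rss, aic in rows:
    print(round(gaussian_aic(rss, n, k), 3), aic)
```

Each recomputed AIC matches the printed value to within about 0.01, confirming the convention used.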

# Stepwise regression based on p-values for the X4-eliminated model
k <- ols_step_both_p(model_wf_rm4_log)
## Stepwise Selection Method   
## ---------------------------
## 
## Candidate Terms: 
## 
## 1. X1 
## 2. X2 
## 3. X3 
## 4. X5 
## 5. X6 
## 6. X7 
## 7. X8 
## 8. X9 
## 
## We are selecting variables based on p value...
## 
## Variables Entered/Removed: 
## 
## - X1 added 
## - X3 added 
## - X7 added 
## - X6 added 
## - X8 added 
## - X9 added 
## 
## No more variables to be added/removed.
## 
## 
## Final Model Output 
## ------------------
## 
##                         Model Summary                         
## -------------------------------------------------------------
## R                       0.971       RMSE               0.421 
## R-Squared               0.943       Coef. Var          6.618 
## Adj. R-Squared          0.928       MSE                0.178 
## Pred R-Squared          0.900       MAE                0.292 
## -------------------------------------------------------------
##  RMSE: Root Mean Square Error 
##  MSE: Mean Square Error 
##  MAE: Mean Absolute Error 
## 
##                                ANOVA                                
## -------------------------------------------------------------------
##                Sum of                                              
##               Squares        DF    Mean Square      F         Sig. 
## -------------------------------------------------------------------
## Regression     67.310         6         11.218    63.195    0.0000 
## Residual        4.083        23          0.178                     
## Total          71.393        29                                    
## -------------------------------------------------------------------
## 
##                                   Parameter Estimates                                    
## ----------------------------------------------------------------------------------------
##       model      Beta    Std. Error    Std. Beta      t        Sig      lower     upper 
## ----------------------------------------------------------------------------------------
## (Intercept)     2.307         0.410                  5.623    0.000     1.458     3.156 
##          X1     0.207         0.053        0.368     3.897    0.001     0.097     0.317 
##          X3     0.263         0.022        0.680    11.944    0.000     0.217     0.308 
##          X7     5.453         1.002        0.542     5.442    0.000     3.380     7.525 
##          X6    -0.532         0.144       -0.192    -3.688    0.001    -0.831    -0.234 
##          X8     0.613         0.137        0.495     4.462    0.000     0.329     0.897 
##          X9    -0.433         0.112       -0.435    -3.864    0.001    -0.665    -0.201 
## ----------------------------------------------------------------------------------------
k
## 
##                              Stepwise Selection Summary                               
## -------------------------------------------------------------------------------------
##                      Added/                   Adj.                                       
## Step    Variable    Removed     R-Square    R-Square      C(p)        AIC       RMSE     
## -------------------------------------------------------------------------------------
##    1       X1       addition       0.527       0.510    154.8520    94.7131    1.0987    
##    2       X3       addition       0.812       0.798     47.7990    68.9988    0.7050    
##    3       X7       addition       0.872       0.857     26.9890    59.5306    0.5934    
##    4       X6       addition       0.893       0.876     20.8070    56.0486    0.5523    
##    5       X8       addition       0.906       0.886     18.0270    54.3108    0.5297    
##    6       X9       addition       0.943       0.928      5.8470    41.3046    0.4213    
## -------------------------------------------------------------------------------------
plot(k)

# Stepwise AIC regression for the X4-eliminated model
k <- ols_step_both_aic(model_wf_rm4_log)
## Stepwise Selection Method 
## -------------------------
## 
## Candidate Terms: 
## 
## 1 . X1 
## 2 . X2 
## 3 . X3 
## 4 . X5 
## 5 . X6 
## 6 . X7 
## 7 . X8 
## 8 . X9 
## 
## 
## Variables Entered/Removed: 
## 
## - X1 added 
## - X3 added 
## - X7 added 
## - X6 added 
## - X8 added 
## - X9 added 
## 
## No more variables to be added or removed.
k
## 
## 
##                               Stepwise Summary                              
## --------------------------------------------------------------------------
## Variable     Method      AIC       RSS      Sum Sq     R-Sq      Adj. R-Sq 
## --------------------------------------------------------------------------
## X1          addition    94.713    33.799    37.594    0.52658      0.50967 
## X3          addition    68.999    13.418    57.974    0.81205      0.79813 
## X7          addition    59.531     9.155    62.237    0.87176      0.85696 
## X6          addition    56.049     7.626    63.766    0.89318      0.87609 
## X8          addition    54.311     6.733    64.660    0.90569      0.88604 
## X9          addition    41.305     4.083    67.310    0.94281      0.92789 
## --------------------------------------------------------------------------
plot(k)
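The Adj. R-Sq column in the summary above follows directly from the R-Sq column via the standard adjustment Adj R² = 1 − (1 − R²)(n − 1)/(n − k − 1), with k predictors and n = 30. A short check on the transcribed values:

```python
# Adjusted R^2 check for the stepwise AIC summary above (n = 30):
# Adj R^2 = 1 - (1 - R^2) * (n - 1) / (n - k - 1), k = number of predictors
n = 30
rows = [  # (k, R-Sq, reported Adj. R-Sq)
    (1, 0.52658, 0.50967),
    (2, 0.81205, 0.79813),
    (3, 0.87176, 0.85696),
    (4, 0.89318, 0.87609),
    (5, 0.90569, 0.88604),
    (6, 0.94281, 0.92789),
]
for k, r2, adj in rows:
    adj_hat = 1 - (1 - r2) * (n - 1) / (n - k - 1)
    print(round(adj_hat, 5), adj)
```

All six rows reproduce the reported adjusted values to the printed precision.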

# Stepwise regression based on p-values for the X1-eliminated model
k <- ols_step_both_p(model_wf_rm1_log)
## Stepwise Selection Method   
## ---------------------------
## 
## Candidate Terms: 
## 
## 1. X2 
## 2. X3 
## 3. X4 
## 4. X5 
## 5. X6 
## 6. X7 
## 7. X8 
## 8. X9 
## 
## We are selecting variables based on p value...
## 
## Variables Entered/Removed: 
## 
## - X4 added 
## - X3 added 
## - X7 added 
## 
## No more variables to be added/removed.
## 
## 
## Final Model Output 
## ------------------
## 
##                         Model Summary                         
## -------------------------------------------------------------
## R                       0.944       RMSE               0.549 
## R-Squared               0.890       Coef. Var          8.618 
## Adj. R-Squared          0.878       MSE                0.301 
## Pred R-Squared          0.854       MAE                0.414 
## -------------------------------------------------------------
##  RMSE: Root Mean Square Error 
##  MSE: Mean Square Error 
##  MAE: Mean Absolute Error 
## 
##                                ANOVA                                
## -------------------------------------------------------------------
##                Sum of                                              
##               Squares        DF    Mean Square      F         Sig. 
## -------------------------------------------------------------------
## Regression     63.565         3         21.188    70.378    0.0000 
## Residual        7.828        26          0.301                     
## Total          71.393        29                                    
## -------------------------------------------------------------------
## 
##                                  Parameter Estimates                                  
## -------------------------------------------------------------------------------------
##       model     Beta    Std. Error    Std. Beta      t       Sig      lower    upper 
## -------------------------------------------------------------------------------------
## (Intercept)    2.872         0.547                 5.254    0.000     1.748    3.995 
##          X4    0.122         0.033        0.559    3.730    0.001     0.055    0.189 
##          X3    0.168         0.040        0.435    4.165    0.000     0.085    0.251 
##          X7    3.106         1.537        0.309    2.021    0.054    -0.053    6.266 
## -------------------------------------------------------------------------------------
k
## 
##                              Stepwise Selection Summary                              
## ------------------------------------------------------------------------------------
##                      Added/                   Adj.                                      
## Step    Variable    Removed     R-Square    R-Square     C(p)        AIC       RMSE     
## ------------------------------------------------------------------------------------
##    1       X4       addition       0.803       0.796    52.5890    68.4060    0.7087    
##    2       X3       addition       0.873       0.864    26.6180    57.2082    0.5792    
##    3       X7       addition       0.890       0.878    21.7450    54.8305    0.5487    
## ------------------------------------------------------------------------------------
plot(k)

# Stepwise AIC regression for the X1-eliminated model
k <- ols_step_both_aic(model_wf_rm1_log)
## Stepwise Selection Method 
## -------------------------
## 
## Candidate Terms: 
## 
## 1 . X2 
## 2 . X3 
## 3 . X4 
## 4 . X5 
## 5 . X6 
## 6 . X7 
## 7 . X8 
## 8 . X9 
## 
## 
## Variables Entered/Removed: 
## 
## - X4 added 
## - X3 added 
## - X7 added 
## - X8 added 
## - X9 added 
## - X6 added 
## 
## No more variables to be added or removed.
k
## 
## 
##                               Stepwise Summary                              
## --------------------------------------------------------------------------
## Variable     Method      AIC       RSS      Sum Sq     R-Sq      Adj. R-Sq 
## --------------------------------------------------------------------------
## X4          addition    68.406    14.063    57.330    0.80302      0.79599 
## X3          addition    57.208     9.057    62.335    0.87313      0.86373 
## X7          addition    54.830     7.828    63.565    0.89036      0.87771 
## X8          addition    54.522     7.248    64.144    0.89848      0.88223 
## X9          addition    44.504     4.856    66.537    0.93199      0.91782 
## X6          addition    39.161     3.801    67.591    0.94675      0.93286 
## --------------------------------------------------------------------------
plot(k)
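The stepwise AIC values reported above can be cross-checked by refitting the nested models by hand and calling `AIC()` directly. A minimal sketch on synthetic data (the `table_wf` data are not reproduced in this excerpt, so the columns below are stand-ins):

```r
# Sketch: refit the nested models from the stepwise path and compare AIC
# directly (synthetic stand-in for table_wf).
set.seed(1)
d <- data.frame(X4 = rnorm(30), X3 = rnorm(30), X7 = rnorm(30))
d$y <- exp(1 + 0.12 * d$X4 + 0.17 * d$X3 + 0.3 * d$X7 + rnorm(30, sd = 0.3))

m1 <- lm(log(y) ~ X4, data = d)
m2 <- lm(log(y) ~ X4 + X3, data = d)
m3 <- lm(log(y) ~ X4 + X3 + X7, data = d)
sapply(list(m1, m2, m3), AIC)   # one AIC per step of the path
```

A variable is retained at each step only if adding it lowers the AIC, which is what the `Stepwise Selection Summary` above tabulates.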

  • Model 437896
# build model 437896
model_wf_437896_log <- lm(log(y) ~ X4 + X3 + X7 + X8 + X9 + X6, data=table_wf)
ols_regress(model_wf_437896_log)
##                         Model Summary                         
## -------------------------------------------------------------
## R                       0.973       RMSE               0.407 
## R-Squared               0.947       Coef. Var          6.385 
## Adj. R-Squared          0.933       MSE                0.165 
## Pred R-Squared          0.908       MAE                0.273 
## -------------------------------------------------------------
##  RMSE: Root Mean Square Error 
##  MSE: Mean Square Error 
##  MAE: Mean Absolute Error 
## 
##                                ANOVA                                
## -------------------------------------------------------------------
##                Sum of                                              
##               Squares        DF    Mean Square      F         Sig. 
## -------------------------------------------------------------------
## Regression     67.591         6         11.265     68.16    0.0000 
## Residual        3.801        23          0.165                     
## Total          71.393        29                                    
## -------------------------------------------------------------------
## 
##                                   Parameter Estimates                                    
## ----------------------------------------------------------------------------------------
##       model      Beta    Std. Error    Std. Beta      t        Sig      lower     upper 
## ----------------------------------------------------------------------------------------
## (Intercept)     2.692         0.445                  6.046    0.000     1.771     3.613 
##          X4     0.109         0.026        0.499     4.244    0.000     0.056     0.162 
##          X3     0.184         0.032        0.476     5.698    0.000     0.117     0.251 
##          X7     4.085         1.213        0.406     3.367    0.003     1.575     6.595 
##          X8     0.612         0.133        0.493     4.614    0.000     0.337     0.886 
##          X9    -0.448         0.108       -0.450    -4.135    0.000    -0.672    -0.224 
##          X6    -0.368         0.146       -0.133    -2.526    0.019    -0.669    -0.066 
## ----------------------------------------------------------------------------------------
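The `Pred R-Squared` figure in the model summary above is, if olsrr follows the usual convention, the PRESS-based predicted R-squared. A base-R sketch of that computation (illustrated on the built-in `mtcars` data, since `table_wf` is not shown here; the olsrr formula is assumed, not verified):

```r
# Predicted R-squared from PRESS residuals e_i / (1 - h_i), where h_i is
# the leverage; illustrated on mtcars as a stand-in for table_wf.
press_r2 <- function(m) {
  pr <- residuals(m) / (1 - hatvalues(m))      # leave-one-out residuals
  y  <- model.response(model.frame(m))
  1 - sum(pr^2) / sum((y - mean(y))^2)
}

m <- lm(mpg ~ wt + hp, data = mtcars)
press_r2(m)   # always below the ordinary R-squared
</imports>
```

Because each PRESS residual is at least as large in magnitude as the corresponding ordinary residual, the predicted R-squared is always below the ordinary R-squared, as in the summary above (0.908 vs 0.947).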
# Collinearity Diagnostics #
ols_coll_diag(model_wf_437896_log)
## Tolerance and Variance Inflation Factor
## ---------------------------------------
## # A tibble: 6 x 3
##   Variables Tolerance   VIF
##   <chr>         <dbl> <dbl>
## 1 X4            0.167  5.97
## 2 X3            0.332  3.01
## 3 X7            0.159  6.28
## 4 X8            0.202  4.94
## 5 X9            0.195  5.12
## 6 X6            0.839  1.19
## 
## 
## Eigenvalue and Condition Index
## ------------------------------
##   Eigenvalue Condition Index    intercept           X4          X3           X7           X8           X9         X6
## 1 6.06603799        1.000000 0.0007146972 0.0012879224 0.001577168 6.255392e-04 0.0008014993 0.0009066212 0.00304547
## 2 0.33834763        4.234196 0.0025641623 0.1040613624 0.005790171 1.173277e-02 0.0084021537 0.0065708241 0.02244379
## 3 0.31443225        4.392270 0.0007827736 0.0005853923 0.117541638 3.247682e-03 0.0152300675 0.0281423928 0.02635038
## 4 0.18092020        5.790406 0.0043202653 0.0171208486 0.087238632 1.883680e-02 0.0099587626 0.0185479834 0.30810079
## 5 0.07065103        9.266022 0.1767257312 0.0717821748 0.003008880 4.520519e-02 0.0114424989 0.0089921454 0.54826833
## 6 0.01847255       18.121293 0.0001449205 0.0053427347 0.008960972 7.477679e-06 0.9456383423 0.9366327464 0.03064530
## 7 0.01113836       23.336833 0.8147474499 0.7998195649 0.775882539 9.203445e-01 0.0085266757 0.0002072867 0.06114594
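For reference, both diagnostics above can be reproduced from first principles: VIF_j = 1/(1 - R²_j), where R²_j comes from regressing X_j on the other predictors, and the condition indices are sqrt(λ_max/λ_j) for the eigenvalues of the column-equilibrated cross-product of the design matrix. A sketch on synthetic data (the exact scaling olsrr uses is assumed, not verified):

```r
# VIF and condition indices from first principles (synthetic data).
set.seed(2)
X <- data.frame(a = rnorm(50), c = rnorm(50))
X$b <- X$a + rnorm(50, sd = 0.3)     # b is nearly collinear with a

vif_by_hand <- function(df) {
  sapply(names(df), function(v) {
    r2 <- summary(lm(reformulate(setdiff(names(df), v), v), data = df))$r.squared
    1 / (1 - r2)                     # VIF_j = 1 / (1 - R_j^2)
  })
}
vif_by_hand(X)                       # a and b show inflated values

# Condition indices: scale each design-matrix column (intercept included)
# to unit length, then take sqrt(max eigenvalue / each eigenvalue).
Z  <- cbind(1, as.matrix(X))
Z  <- sweep(Z, 2, sqrt(colSums(Z^2)), "/")
ev <- eigen(crossprod(Z), symmetric = TRUE)$values
sqrt(max(ev) / ev)                   # first index is 1 by construction
```

By the usual rules of thumb, VIF above about 10 or a condition index above about 30 signals problematic collinearity; the values above for model 437896 stay below those cut-offs.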
# Model Fit Assessment
ols_plot_diagnostics(model_wf_437896_log)

# Correlation Test for Normality
ols_test_correlation(model_wf_437896_log) # Correlation between observed residuals and expected residuals under normality.
## [1] 0.9837263
# Residual Normality Test
ols_test_normality(model_wf_437896_log) # Tests for violation of the normality assumption; a large p-value indicates no evidence of non-normality.
## -----------------------------------------------
##        Test             Statistic       pvalue  
## -----------------------------------------------
## Shapiro-Wilk              0.9728         0.6175 
## Kolmogorov-Smirnov        0.0997         0.8982 
## Cramer-von Mises          4.8429         0.0000 
## Anderson-Darling          0.2996         0.5612 
## -----------------------------------------------
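The Shapiro-Wilk line above can be reproduced with base R alone; a minimal sketch, using the built-in `cars` data in place of `table_wf`:

```r
# Shapiro-Wilk test on model residuals with base R only
# (cars stands in for table_wf, which is not included here).
m <- lm(dist ~ speed, data = cars)
shapiro.test(residuals(m))
```

As with the olsrr output, a large p-value gives no evidence against residual normality.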
# Variable Contributions
ols_plot_added_variable(model_wf_437896_log)

# Residual Plus Component Plot
ols_plot_comp_plus_resid(model_wf_437896_log)

  • Model 437
# build model 437
model_wf_437_log <- lm(log(y) ~ X4 + X3 + X7, data=table_wf)
ols_regress(model_wf_437_log)
##                         Model Summary                         
## -------------------------------------------------------------
## R                       0.944       RMSE               0.549 
## R-Squared               0.890       Coef. Var          8.618 
## Adj. R-Squared          0.878       MSE                0.301 
## Pred R-Squared          0.854       MAE                0.414 
## -------------------------------------------------------------
##  RMSE: Root Mean Square Error 
##  MSE: Mean Square Error 
##  MAE: Mean Absolute Error 
## 
##                                ANOVA                                
## -------------------------------------------------------------------
##                Sum of                                              
##               Squares        DF    Mean Square      F         Sig. 
## -------------------------------------------------------------------
## Regression     63.565         3         21.188    70.378    0.0000 
## Residual        7.828        26          0.301                     
## Total          71.393        29                                    
## -------------------------------------------------------------------
## 
##                                  Parameter Estimates                                  
## -------------------------------------------------------------------------------------
##       model     Beta    Std. Error    Std. Beta      t       Sig      lower    upper 
## -------------------------------------------------------------------------------------
## (Intercept)    2.872         0.547                 5.254    0.000     1.748    3.995 
##          X4    0.122         0.033        0.559    3.730    0.001     0.055    0.189 
##          X3    0.168         0.040        0.435    4.165    0.000     0.085    0.251 
##          X7    3.106         1.537        0.309    2.021    0.054    -0.053    6.266 
## -------------------------------------------------------------------------------------
# Collinearity Diagnostics
ols_coll_diag(model_wf_437_log)
## Tolerance and Variance Inflation Factor
## ---------------------------------------
## # A tibble: 3 x 3
##   Variables Tolerance   VIF
##   <chr>         <dbl> <dbl>
## 1 X4            0.188  5.32
## 2 X3            0.386  2.59
## 3 X7            0.181  5.53
## 
## 
## Eigenvalue and Condition Index
## ------------------------------
##   Eigenvalue Condition Index   intercept          X4          X3          X7
## 1 3.51967088        1.000000 0.002498604 0.004729201 0.005661232 0.002143973
## 2 0.30192504        3.414298 0.006749341 0.054452269 0.142053567 0.018736742
## 3 0.16605489        4.603893 0.068513979 0.150366864 0.078164912 0.024726360
## 4 0.01234919       16.882304 0.922238076 0.790451666 0.774120289 0.954392926
# Model Fit Assessment
ols_plot_diagnostics(model_wf_437_log)

# Correlation Test for Normality
ols_test_correlation(model_wf_437_log) # Correlation between observed residuals and expected residuals under normality.
## [1] 0.9856766
# Residual Normality Test
ols_test_normality(model_wf_437_log) # Tests for violation of the normality assumption; a large p-value indicates no evidence of non-normality.
## -----------------------------------------------
##        Test             Statistic       pvalue  
## -----------------------------------------------
## Shapiro-Wilk              0.9765         0.7267 
## Kolmogorov-Smirnov        0.1033         0.8736 
## Cramer-von Mises          3.1908         0.0000 
## Anderson-Darling          0.3511         0.4469 
## -----------------------------------------------
# Variable Contributions
ols_plot_added_variable(model_wf_437_log)

# Residual Plus Component Plot
ols_plot_comp_plus_resid(model_wf_437_log)

  • Model 137689
# build model 137689
model_wf_137689_log <- lm(log(y) ~ X1 + X3 + X7 + X6 + X8 + X9, data=table_wf)
ols_regress(model_wf_137689_log)
##                         Model Summary                         
## -------------------------------------------------------------
## R                       0.971       RMSE               0.421 
## R-Squared               0.943       Coef. Var          6.618 
## Adj. R-Squared          0.928       MSE                0.178 
## Pred R-Squared          0.900       MAE                0.292 
## -------------------------------------------------------------
##  RMSE: Root Mean Square Error 
##  MSE: Mean Square Error 
##  MAE: Mean Absolute Error 
## 
##                                ANOVA                                
## -------------------------------------------------------------------
##                Sum of                                              
##               Squares        DF    Mean Square      F         Sig. 
## -------------------------------------------------------------------
## Regression     67.310         6         11.218    63.195    0.0000 
## Residual        4.083        23          0.178                     
## Total          71.393        29                                    
## -------------------------------------------------------------------
## 
##                                   Parameter Estimates                                    
## ----------------------------------------------------------------------------------------
##       model      Beta    Std. Error    Std. Beta      t        Sig      lower     upper 
## ----------------------------------------------------------------------------------------
## (Intercept)     2.307         0.410                  5.623    0.000     1.458     3.156 
##          X1     0.207         0.053        0.368     3.897    0.001     0.097     0.317 
##          X3     0.263         0.022        0.680    11.944    0.000     0.217     0.308 
##          X7     5.453         1.002        0.542     5.442    0.000     3.380     7.525 
##          X6    -0.532         0.144       -0.192    -3.688    0.001    -0.831    -0.234 
##          X8     0.613         0.137        0.495     4.462    0.000     0.329     0.897 
##          X9    -0.433         0.112       -0.435    -3.864    0.001    -0.665    -0.201 
## ----------------------------------------------------------------------------------------
# Collinearity Diagnostics #
ols_coll_diag(model_wf_137689_log)
## Tolerance and Variance Inflation Factor
## ---------------------------------------
## # A tibble: 6 x 3
##   Variables Tolerance   VIF
##   <chr>         <dbl> <dbl>
## 1 X1            0.279  3.58
## 2 X3            0.768  1.30
## 3 X7            0.251  3.99
## 4 X6            0.917  1.09
## 5 X8            0.202  4.94
## 6 X9            0.196  5.10
## 
## 
## Eigenvalue and Condition Index
## ------------------------------
##   Eigenvalue Condition Index    intercept          X1          X3           X7          X6           X8           X9
## 1 5.87754632        1.000000 0.0009583976 0.002753729 0.003752062 0.0010622321 0.003554176 0.0008552526 0.0009724289
## 2 0.55370282        3.258064 0.0022409295 0.184392777 0.048522556 0.0065506985 0.008922660 0.0011708384 0.0004973735
## 3 0.29276462        4.480626 0.0006141795 0.021479187 0.183073381 0.0001146981 0.036353961 0.0263007081 0.0424526993
## 4 0.15641976        6.129884 0.0050342867 0.054272760 0.378429650 0.0153101550 0.417215036 0.0036091737 0.0092883543
## 5 0.08151723        8.491283 0.1288772252 0.136506302 0.004864520 0.1464814595 0.482676303 0.0146393598 0.0085093969
## 6 0.02005845       17.117856 0.6268955520 0.471310996 0.317220373 0.5780521912 0.030882409 0.1844189407 0.2640851038
## 7 0.01799079       18.074773 0.2353794295 0.129284248 0.064137458 0.2524285656 0.020395454 0.7690057267 0.6741946434
# Model Fit Assessment
ols_plot_diagnostics(model_wf_137689_log)

# Correlation Test for Normality
ols_test_correlation(model_wf_137689_log) # Correlation between observed residuals and expected residuals under normality.
## [1] 0.988106
# Residual Normality Test
ols_test_normality(model_wf_137689_log) # Tests for violation of the normality assumption; a large p-value indicates no evidence of non-normality.
## -----------------------------------------------
##        Test             Statistic       pvalue  
## -----------------------------------------------
## Shapiro-Wilk              0.9769         0.7382 
## Kolmogorov-Smirnov        0.0771         0.9881 
## Cramer-von Mises          4.4689         0.0000 
## Anderson-Darling          0.1644         0.9350 
## -----------------------------------------------
# Variable Contributions
ols_plot_added_variable(model_wf_137689_log)

# Residual Plus Component Plot
ols_plot_comp_plus_resid(model_wf_137689_log)

  • Other Models
# build X1*X8 interaction log model
model_wf_18rm4_log <- lm(log(y) ~ X1*X8 + X3 + X6 + X7 + X9, data=table_wf)
summary(model_wf_18rm4_log)
## 
## Call:
## lm(formula = log(y) ~ X1 * X8 + X3 + X6 + X7 + X9, data = table_wf)
## 
## Residuals:
##      Min       1Q   Median       3Q      Max 
## -0.66685 -0.22572 -0.06507  0.25283  0.65751 
## 
## Coefficients:
##             Estimate Std. Error t value Pr(>|t|)    
## (Intercept)  2.61673    0.41271   6.340 2.22e-06 ***
## X1           0.07506    0.08138   0.922 0.366299    
## X8           0.50281    0.13939   3.607 0.001564 ** 
## X3           0.26363    0.02059  12.803 1.14e-11 ***
## X6          -0.54299    0.13525  -4.015 0.000582 ***
## X7           5.63198    0.94231   5.977 5.14e-06 ***
## X9          -0.44919    0.10520  -4.270 0.000312 ***
## X1:X8        0.04265    0.02075   2.056 0.051887 .  
## ---
## Signif. codes:  0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1
## 
## Residual standard error: 0.3946 on 22 degrees of freedom
## Multiple R-squared:  0.952,  Adjusted R-squared:  0.9368 
## F-statistic: 62.37 on 7 and 22 DF,  p-value: 4.821e-13
# build model with the X1*X8 product as a single predictor
table_wf_resi <- table_wf%>% mutate(x1t8=X1*X8)
model_wf_1time8_log <- lm(log(y) ~ x1t8 + X3 + X6 + X7+ X9 , data=table_wf_resi)
summary(model_wf_1time8_log)
## 
## Call:
## lm(formula = log(y) ~ x1t8 + X3 + X6 + X7 + X9, data = table_wf_resi)
## 
## Residuals:
##      Min       1Q   Median       3Q      Max 
## -0.88041 -0.27426  0.04247  0.31811  0.87796 
## 
## Coefficients:
##             Estimate Std. Error t value Pr(>|t|)    
## (Intercept)  3.00659    0.47894   6.278 1.73e-06 ***
## x1t8         0.06871    0.01488   4.618  0.00011 ***
## X3           0.26587    0.02422  10.976 7.74e-11 ***
## X6          -0.46614    0.16072  -2.900  0.00785 ** 
## X7           5.46737    0.89636   6.100 2.67e-06 ***
## X9          -0.14416    0.06888  -2.093  0.04711 *  
## ---
## Signif. codes:  0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1
## 
## Residual standard error: 0.4768 on 24 degrees of freedom
## Multiple R-squared:  0.9236, Adjusted R-squared:  0.9076 
## F-statistic:    58 on 5 and 24 DF,  p-value: 1.291e-12
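Building the `x1t8` column with `mutate()` before fitting is equivalent to writing the product directly in the formula with `I()` or `:`; a quick check on built-in data (`mtcars` stands in for `table_wf` here):

```r
# A mutated product column and a formula interaction term give the same fit.
m_col <- lm(mpg ~ I(wt * hp) + cyl, data = mtcars)   # explicit product column
m_int <- lm(mpg ~ wt:hp + cyl, data = mtcars)        # `:` interaction term
all.equal(coef(m_col)[["I(wt * hp)"]], coef(m_int)[["wt:hp"]])
```

The two models span the same column space, so their fitted values and the product-term coefficient agree exactly; only the coefficient labels differ.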
# build model with the X1*X4 product replacing X1 and X4
table_wf_resi <- table_wf%>% mutate(x1t4=X1*X4)
model_wf_1time4_log <- lm(log(y) ~ x1t4 + X3 + X6 + X7+ X8+ X9, data=table_wf_resi)
summary(model_wf_1time4_log)
## 
## Call:
## lm(formula = log(y) ~ x1t4 + X3 + X6 + X7 + X8 + X9, data = table_wf_resi)
## 
## Residuals:
##     Min      1Q  Median      3Q     Max 
## -1.0156 -0.2543  0.0070  0.2564  0.6057 
## 
## Coefficients:
##              Estimate Std. Error t value Pr(>|t|)    
## (Intercept)  2.139295   0.393809   5.432 1.61e-05 ***
## x1t4         0.009493   0.002504   3.791 0.000945 ***
## X3           0.274661   0.021379  12.847 5.60e-12 ***
## X6          -0.557992   0.145847  -3.826 0.000866 ***
## X7           6.105175   0.885304   6.896 4.96e-07 ***
## X8           0.615842   0.138876   4.434 0.000191 ***
## X9          -0.435302   0.113260  -3.843 0.000829 ***
## ---
## Signif. codes:  0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1
## 
## Residual standard error: 0.4259 on 23 degrees of freedom
## Multiple R-squared:  0.9416, Adjusted R-squared:  0.9263 
## F-statistic: 61.76 on 6 and 23 DF,  p-value: 4.956e-13
# build model with the X1/X4 ratio replacing X1 and X4
table_wf_resi <- table_wf%>% mutate(x14=X1/X4)
model_wf_1per4_log <- lm(log(y) ~ x14 + X3 + X6 + X7+ X8+ X9, data=table_wf_resi)
summary(model_wf_1per4_log)
## 
## Call:
## lm(formula = log(y) ~ x14 + X3 + X6 + X7 + X8 + X9, data = table_wf_resi)
## 
## Residuals:
##      Min       1Q   Median       3Q      Max 
## -0.95043 -0.27043 -0.00489  0.32385  0.65175 
## 
## Coefficients:
##             Estimate Std. Error t value Pr(>|t|)    
## (Intercept)  2.32714    0.40155   5.795 6.64e-06 ***
## x14          5.05887    1.23730   4.089 0.000451 ***
## X3           0.24874    0.02282  10.901 1.47e-10 ***
## X6          -0.50882    0.14176  -3.589 0.001551 ** 
## X7           4.61674    1.13829   4.056 0.000490 ***
## X8           0.61018    0.13471   4.530 0.000150 ***
## X9          -0.42716    0.10978  -3.891 0.000737 ***
## ---
## Signif. codes:  0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1
## 
## Residual standard error: 0.4131 on 23 degrees of freedom
## Multiple R-squared:  0.945,  Adjusted R-squared:  0.9307 
## F-statistic: 65.89 on 6 and 23 DF,  p-value: 2.476e-13
# build X4*X3 interaction log model
model_wf_43rm1_log <- lm(log(y) ~ X9 + X4*X3 + X6 + X7 + X8 , data=table_wf)
summary(model_wf_43rm1_log)
## 
## Call:
## lm(formula = log(y) ~ X9 + X4 * X3 + X6 + X7 + X8, data = table_wf)
## 
## Residuals:
##      Min       1Q   Median       3Q      Max 
## -0.98067 -0.15171  0.02747  0.19477  0.52079 
## 
## Coefficients:
##              Estimate Std. Error t value Pr(>|t|)    
## (Intercept)  2.084490   0.423701   4.920 6.40e-05 ***
## X9          -0.455031   0.091844  -4.954 5.88e-05 ***
## X4           0.278500   0.057899   4.810 8.35e-05 ***
## X3           0.369731   0.064897   5.697 9.92e-06 ***
## X6          -0.303098   0.125087  -2.423  0.02407 *  
## X7           2.631732   1.127027   2.335  0.02906 *  
## X8           0.617333   0.112437   5.491 1.62e-05 ***
## X4:X3       -0.025163   0.007966  -3.159  0.00455 ** 
## ---
## Signif. codes:  0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1
## 
## Residual standard error: 0.3448 on 22 degrees of freedom
## Multiple R-squared:  0.9634, Adjusted R-squared:  0.9517 
## F-statistic: 82.66 on 7 and 22 DF,  p-value: 2.547e-14
# build X4*X9 interaction log model
model_wf_49rm1_log <- lm(log(y) ~ X3 + X4*X9 + X6 + X7 + X8 , data=table_wf)
summary(model_wf_49rm1_log)
## 
## Call:
## lm(formula = log(y) ~ X3 + X4 * X9 + X6 + X7 + X8, data = table_wf)
## 
## Residuals:
##      Min       1Q   Median       3Q      Max 
## -0.76768 -0.14948 -0.00926  0.15144  0.78466 
## 
## Coefficients:
##              Estimate Std. Error t value Pr(>|t|)    
## (Intercept)  3.258424   0.489062   6.663 1.07e-06 ***
## X3           0.188917   0.030043   6.288 2.50e-06 ***
## X4           0.052531   0.035346   1.486 0.151417    
## X9          -0.585947   0.119054  -4.922 6.37e-05 ***
## X6          -0.360247   0.135120  -2.666 0.014109 *  
## X7           4.120858   1.126224   3.659 0.001380 ** 
## X8           0.540204   0.127385   4.241 0.000335 ***
## X4:X9        0.016971   0.007833   2.167 0.041371 *  
## ---
## Signif. codes:  0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1
## 
## Residual standard error: 0.3774 on 22 degrees of freedom
## Multiple R-squared:  0.9561, Adjusted R-squared:  0.9422 
## F-statistic: 68.48 on 7 and 22 DF,  p-value: 1.826e-13
# build X4*X8 interaction log model
model_wf_48rm1_log <- lm(log(y) ~ X3 + X4*X8 + X6 + X7 + X9 , data=table_wf)
summary(model_wf_48rm1_log)
## 
## Call:
## lm(formula = log(y) ~ X3 + X4 * X8 + X6 + X7 + X9, data = table_wf)
## 
## Residuals:
##      Min       1Q   Median       3Q      Max 
## -0.59137 -0.25522 -0.04126  0.13881  0.72534 
## 
## Coefficients:
##              Estimate Std. Error t value Pr(>|t|)    
## (Intercept)  3.346540   0.457447   7.316 2.52e-07 ***
## X3           0.188368   0.028445   6.622 1.17e-06 ***
## X4           0.042547   0.032959   1.291 0.210138    
## X8           0.432146   0.133408   3.239 0.003766 ** 
## X6          -0.396641   0.128504  -3.087 0.005391 ** 
## X7           4.253985   1.069457   3.978 0.000637 ***
## X9          -0.512646   0.098122  -5.225 3.06e-05 ***
## X4:X8        0.022403   0.008077   2.774 0.011078 *  
## ---
## Signif. codes:  0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1
## 
## Residual standard error: 0.3578 on 22 degrees of freedom
## Multiple R-squared:  0.9605, Adjusted R-squared:  0.948 
## F-statistic: 76.52 on 7 and 22 DF,  p-value: 5.719e-14
# build X4*X7 interaction log model
model_wf_47rm1_log <- lm(log(y) ~ X3 + X4*X7 + X6 + X9 + X8 , data=table_wf)
summary(model_wf_47rm1_log)
## 
## Call:
## lm(formula = log(y) ~ X3 + X4 * X7 + X6 + X9 + X8, data = table_wf)
## 
## Residuals:
##     Min      1Q  Median      3Q     Max 
## -0.8675 -0.2486 -0.0506  0.2433  0.8172 
## 
## Coefficients:
##             Estimate Std. Error t value Pr(>|t|)    
## (Intercept)  3.20436    0.47536   6.741 8.95e-07 ***
## X3           0.27436    0.05124   5.355 2.24e-05 ***
## X4          -0.12965    0.11223  -1.155 0.260387    
## X7           0.48550    2.00022   0.243 0.810469    
## X6          -0.27531    0.14146  -1.946 0.064515 .  
## X9          -0.43936    0.10048  -4.373 0.000243 ***
## X8           0.60482    0.12299   4.918 6.43e-05 ***
## X4:X7        0.53301    0.24489   2.177 0.040526 *  
## ---
## Signif. codes:  0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1
## 
## Residual standard error: 0.3771 on 22 degrees of freedom
## Multiple R-squared:  0.9562, Adjusted R-squared:  0.9422 
## F-statistic: 68.59 on 7 and 22 DF,  p-value: 1.793e-13
# build model with the X4/X9 ratio replacing X4 and X9
table_wf_resi <- table_wf%>% mutate(x4p9=X4/X9)
model_wf_4per9_log <- lm(log(y) ~ X3 + x4p9 + X6 + X7 + X8 , data=table_wf_resi)
summary(model_wf_4per9_log)
## 
## Call:
## lm(formula = log(y) ~ X3 + x4p9 + X6 + X7 + X8, data = table_wf_resi)
## 
## Residuals:
##      Min       1Q   Median       3Q      Max 
## -0.91537 -0.43557  0.01048  0.42799  0.87454 
## 
## Coefficients:
##             Estimate Std. Error t value Pr(>|t|)    
## (Intercept)  1.55063    0.45618   3.399  0.00236 ** 
## X3           0.21359    0.04353   4.907 5.27e-05 ***
## x4p9         0.15570    0.06434   2.420  0.02345 *  
## X6          -0.42944    0.18765  -2.288  0.03122 *  
## X7           6.03307    1.18525   5.090 3.31e-05 ***
## X8           0.38030    0.12355   3.078  0.00515 ** 
## ---
## Signif. codes:  0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1
## 
## Residual standard error: 0.5559 on 24 degrees of freedom
## Multiple R-squared:  0.8961, Adjusted R-squared:  0.8745 
## F-statistic:  41.4 on 5 and 24 DF,  p-value: 4.928e-11
# build model with X3*X4 and X8*X9 interactions
model_wf_34v89_log <- lm(log(y) ~ X3*X4 + X8*X9 + X6 + X7, data=table_wf_resi)
summary(model_wf_34v89_log)
## 
## Call:
## lm(formula = log(y) ~ X3 * X4 + X8 * X9 + X6 + X7, data = table_wf_resi)
## 
## Residuals:
##     Min      1Q  Median      3Q     Max 
## -0.9927 -0.1277  0.0319  0.1526  0.5001 
## 
## Coefficients:
##              Estimate Std. Error t value Pr(>|t|)    
## (Intercept)  2.525631   0.556505   4.538 0.000179 ***
## X3           0.366824   0.064282   5.706 1.15e-05 ***
## X4           0.281727   0.057373   4.910 7.41e-05 ***
## X8           0.491896   0.152332   3.229 0.004022 ** 
## X9          -0.594870   0.147344  -4.037 0.000594 ***
## X6          -0.309074   0.123914  -2.494 0.021037 *  
## X7           2.496248   1.121208   2.226 0.037067 *  
## X3:X4       -0.025692   0.007897  -3.253 0.003801 ** 
## X8:X9        0.041608   0.034502   1.206 0.241241    
## ---
## Signif. codes:  0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1
## 
## Residual standard error: 0.3413 on 21 degrees of freedom
## Multiple R-squared:  0.9657, Adjusted R-squared:  0.9527 
## F-statistic:    74 on 8 and 21 DF,  p-value: 1.212e-13
# build model with X3*X4, X8*X9, and X6*X7 interactions
model_wf_34v89v67_log <- lm(log(y) ~ X3*X4 + X8*X9 + X6*X7, data=table_wf_resi)
summary(model_wf_34v89v67_log)
## 
## Call:
## lm(formula = log(y) ~ X3 * X4 + X8 * X9 + X6 * X7, data = table_wf_resi)
## 
## Residuals:
##      Min       1Q   Median       3Q      Max 
## -1.03364 -0.10378  0.05522  0.13870  0.45748 
## 
## Coefficients:
##              Estimate Std. Error t value Pr(>|t|)    
## (Intercept)  2.317103   0.711605   3.256  0.00395 ** 
## X3           0.381522   0.072162   5.287 3.57e-05 ***
## X4           0.302852   0.072900   4.154  0.00049 ***
## X8           0.490488   0.155212   3.160  0.00492 ** 
## X9          -0.593150   0.150145  -3.951  0.00079 ***
## X6          -0.124687   0.400702  -0.311  0.75889    
## X7           2.803035   1.305757   2.147  0.04427 *  
## X3:X4       -0.028436   0.009836  -2.891  0.00903 ** 
## X8:X9        0.040922   0.035176   1.163  0.25839    
## X6:X7       -0.492695   1.016184  -0.485  0.63305    
## ---
## Signif. codes:  0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1
## 
## Residual standard error: 0.3477 on 20 degrees of freedom
## Multiple R-squared:  0.9661, Adjusted R-squared:  0.9509 
## F-statistic: 63.41 on 9 and 20 DF,  p-value: 9.715e-13
# build model with the X8/X9 ratio and the X4*X3 interaction
table_wf_resi <- table_wf%>% mutate(x8p9=X8/X9)
model_wf_8per9v43_log <- lm(log(y) ~ x8p9 + X4*X3 + X6 + X7, data=table_wf_resi)
summary(model_wf_8per9v43_log)
## 
## Call:
## lm(formula = log(y) ~ x8p9 + X4 * X3 + X6 + X7, data = table_wf_resi)
## 
## Residuals:
##      Min       1Q   Median       3Q      Max 
## -1.15603 -0.10698  0.03199  0.12560  0.37627 
## 
## Coefficients:
##              Estimate Std. Error t value Pr(>|t|)    
## (Intercept)  1.470537   0.376961   3.901 0.000719 ***
## x8p9         1.306885   0.200541   6.517 1.19e-06 ***
## X4           0.301798   0.051706   5.837 6.01e-06 ***
## X3           0.338116   0.058068   5.823 6.21e-06 ***
## X6          -0.369739   0.113542  -3.256 0.003476 ** 
## X7           1.965609   1.006510   1.953 0.063105 .  
## X4:X3       -0.025190   0.007109  -3.543 0.001735 ** 
## ---
## Signif. codes:  0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1
## 
## Residual standard error: 0.3078 on 23 degrees of freedom
## Multiple R-squared:  0.9695, Adjusted R-squared:  0.9615 
## F-statistic: 121.8 on 6 and 23 DF,  p-value: 2.982e-16
# build model with the X6/X7 and X8/X9 ratios and the X4*X3 product
table_wf_resi <- table_wf%>% mutate(x4t3=X4*X3,x8p9=X8/X9,x6p7=X6/X7)
model_wf_6p7v8p9v4t3_log <- lm(log(y) ~ x8p9 + x4t3 + x6p7, data=table_wf_resi)
summary(model_wf_6p7v8p9v4t3_log)
## 
## Call:
## lm(formula = log(y) ~ x8p9 + x4t3 + x6p7, data = table_wf_resi)
## 
## Residuals:
##     Min      1Q  Median      3Q     Max 
## -0.9953 -0.3390  0.1827  0.3281  0.8889 
## 
## Coefficients:
##              Estimate Std. Error t value Pr(>|t|)    
## (Intercept)  4.367696   0.367063  11.899 5.05e-12 ***
## x8p9         1.448120   0.371254   3.901 0.000606 ***
## x4t3         0.022879   0.001768  12.939 7.75e-13 ***
## x6p7        -0.269704   0.048274  -5.587 7.19e-06 ***
## ---
## Signif. codes:  0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1
## 
## Residual standard error: 0.5721 on 26 degrees of freedom
## Multiple R-squared:  0.8808, Adjusted R-squared:  0.8671 
## F-statistic: 64.04 on 3 and 26 DF,  p-value: 3.868e-12
library(huxtable)
huxreg(model_wf_8per9v43_log, model_wf_43rm1_log, model_wf_6p7v8p9v4t3_log, model_wf_34v89_log, model_wf_34v89v67_log)
(1) (2) (3) (4) (5)
(Intercept) 1.471 *** 2.084 *** 4.368 *** 2.526 *** 2.317 ** 
(0.377)    (0.424)    (0.367)    (0.557)    (0.712)   
x8p9 1.307 ***          1.448 ***                  
(0.201)             (0.371)                     
X4 0.302 *** 0.279 ***          0.282 *** 0.303 ***
(0.052)    (0.058)             (0.057)    (0.073)   
X3 0.338 *** 0.370 ***          0.367 *** 0.382 ***
(0.058)    (0.065)             (0.064)    (0.072)   
X6 -0.370 **  -0.303 *            -0.309 *   -0.125    
(0.114)    (0.125)             (0.124)    (0.401)   
X7 1.966     2.632 *            2.496 *   2.803 *  
(1.007)    (1.127)             (1.121)    (1.306)   
X4:X3 -0.025 **  -0.025 **                            
(0.007)    (0.008)                              
X9          -0.455 ***          -0.595 *** -0.593 ***
         (0.092)             (0.147)    (0.150)   
X8          0.617 ***          0.492 **  0.490 ** 
         (0.112)             (0.152)    (0.155)   
x4t3                   0.023 ***                  
                  (0.002)                     
x6p7                   -0.270 ***                  
                  (0.048)                     
X3:X4                            -0.026 **  -0.028 ** 
                           (0.008)    (0.010)   
X8:X9                            0.042     0.041    
                           (0.035)    (0.035)   
X6:X7                                     -0.493    
                                    (1.016)   
N 30         30         30         30         30        
R2 0.969     0.963     0.881     0.966     0.966    
logLik -3.233     -5.970     -23.668     -4.966     -4.790    
AIC 22.467     29.940     57.336     29.931     31.581    
*** p < 0.001; ** p < 0.01; * p < 0.05.

Final models

# Check model 43rm1
summary(model_wf_43rm1_log)
## 
## Call:
## lm(formula = log(y) ~ X9 + X4 * X3 + X6 + X7 + X8, data = table_wf)
## 
## Residuals:
##      Min       1Q   Median       3Q      Max 
## -0.98067 -0.15171  0.02747  0.19477  0.52079 
## 
## Coefficients:
##              Estimate Std. Error t value Pr(>|t|)    
## (Intercept)  2.084490   0.423701   4.920 6.40e-05 ***
## X9          -0.455031   0.091844  -4.954 5.88e-05 ***
## X4           0.278500   0.057899   4.810 8.35e-05 ***
## X3           0.369731   0.064897   5.697 9.92e-06 ***
## X6          -0.303098   0.125087  -2.423  0.02407 *  
## X7           2.631732   1.127027   2.335  0.02906 *  
## X8           0.617333   0.112437   5.491 1.62e-05 ***
## X4:X3       -0.025163   0.007966  -3.159  0.00455 ** 
## ---
## Signif. codes:  0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1
## 
## Residual standard error: 0.3448 on 22 degrees of freedom
## Multiple R-squared:  0.9634, Adjusted R-squared:  0.9517 
## F-statistic: 82.66 on 7 and 22 DF,  p-value: 2.547e-14
Anova(model_wf_43rm1_log)
            Sum Sq Df F value   Pr(>F)
X9           2.92   1   24.5   5.88e-05
X4           2.98   1   25     5.21e-05
X3           5.37   1   45.1   9.41e-07
X6           0.698  1    5.87  0.0241
X7           0.648  1    5.45  0.0291
X8           3.58   1   30.1   1.62e-05
X4:X3        1.19   1    9.98  0.00455
Residuals    2.62  22
# Collinearity Diagnostics #
ols_coll_diag(model_wf_43rm1_log)
## Tolerance and Variance Inflation Factor
## ---------------------------------------
## # A tibble: 7 x 3
##   Variables Tolerance   VIF
##   <chr>         <dbl> <dbl>
## 1 X9           0.195   5.12
## 2 X4           0.0237 42.2 
## 3 X3           0.0590 17.0 
## 4 X6           0.817   1.22
## 5 X7           0.133   7.53
## 6 X8           0.202   4.94
## 7 X4:X3        0.0178 56.2 
## 
## 
## Eigenvalue and Condition Index
## ------------------------------
##    Eigenvalue Condition Index    intercept           X9           X4           X3          X6           X7           X8        X4:X3
## 1 6.849877222        1.000000 0.0004368075 0.0006897602 1.496807e-04 0.0002285053 0.002283090 4.044146e-04 0.0006108017 1.135503e-04
## 2 0.506188292        3.678624 0.0017133811 0.0123285079 2.991467e-03 0.0020333293 0.008740945 1.564570e-04 0.0100388856 4.487306e-03
## 3 0.327186817        4.575552 0.0020505192 0.0026372503 6.863538e-03 0.0150692744 0.039657261 1.296746e-02 0.0002121972 5.407823e-05
## 4 0.206723233        5.756344 0.0042167463 0.0390116319 3.107828e-05 0.0019916892 0.218817320 1.668224e-02 0.0237297448 1.469039e-03
## 5 0.077380498        9.408614 0.1085282487 0.0084232073 1.337524e-03 0.0101590146 0.608327249 2.615214e-02 0.0111352210 3.143882e-03
## 6 0.018473557       19.256003 0.0001865725 0.9361576352 6.056134e-04 0.0018770644 0.030162023 2.635028e-06 0.9448115291 9.952803e-06
## 7 0.011198260       24.732392 0.6861535970 0.0001731810 8.629642e-02 0.1695026022 0.053935286 7.290746e-01 0.0090430576 1.871541e-03
## 8 0.002972121       48.007394 0.1967141276 0.0005788262 9.017247e-01 0.7991385206 0.038076827 2.145601e-01 0.0004185630 9.888506e-01
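The diagnostics above come from olsrr, but both quantities are easy to compute by hand, which clarifies what the table means. A minimal base-R sketch, using `mtcars` as a stand-in for the (not shown) `table_wf` data: VIF for a predictor is 1/(1 - R²) from regressing that predictor on the others, and the condition indices come from the eigenvalues of the unit-scaled cross-product matrix.

```r
# Base-R sketch of what ols_coll_diag() reports (mtcars is a stand-in dataset)
fit <- lm(mpg ~ wt + hp + disp, data = mtcars)

# VIF for wt: regress wt on the remaining predictors, then 1/(1 - R2)
X <- model.matrix(fit)[, -1]                       # predictors, intercept dropped
vif_wt <- 1 / (1 - summary(lm(X[, "wt"] ~ X[, -1]))$r.squared)

# Condition indices: scale every column of the model matrix (including the
# intercept) to unit length, then sqrt(largest eigenvalue / each eigenvalue)
Xs <- apply(model.matrix(fit), 2, function(x) x / sqrt(sum(x^2)))
ev <- eigen(crossprod(Xs), symmetric = TRUE)$values
cond_index <- sqrt(max(ev) / ev)                   # 1 means no collinearity
```

Condition indices above roughly 30 are the usual warning sign, matching how the olsrr table is read.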
#Model Fit Assessment
ols_plot_diagnostics(model_wf_43rm1_log)

# Residual Normality: Correlation Test
ols_test_correlation(model_wf_43rm1_log) # Correlation between observed residuals and expected residuals under normality.
## [1] 0.9540746
# Residual Normality Test
ols_test_normality(model_wf_43rm1_log) # Tests for violation of the normality assumption; a large p-value means no evidence of non-normality #
## -----------------------------------------------
##        Test             Statistic       pvalue  
## -----------------------------------------------
## Shapiro-Wilk              0.9234         0.0329 
## Kolmogorov-Smirnov        0.1067         0.8490 
## Cramer-von Mises          5.6033         0.0000 
## Anderson-Darling          0.6037         0.1064 
## -----------------------------------------------
# Variable Contributions
ols_plot_added_variable(model_wf_43rm1_log)

# Residual Plus Component Plot
ols_plot_comp_plus_resid(model_wf_43rm1_log)

##################################################
# Check model 8per9v43
summary(model_wf_8per9v43_log)
## 
## Call:
## lm(formula = log(y) ~ x8p9 + X4 * X3 + X6 + X7, data = table_wf_resi)
## 
## Residuals:
##      Min       1Q   Median       3Q      Max 
## -1.15603 -0.10698  0.03199  0.12560  0.37627 
## 
## Coefficients:
##              Estimate Std. Error t value Pr(>|t|)    
## (Intercept)  1.470537   0.376961   3.901 0.000719 ***
## x8p9         1.306885   0.200541   6.517 1.19e-06 ***
## X4           0.301798   0.051706   5.837 6.01e-06 ***
## X3           0.338116   0.058068   5.823 6.21e-06 ***
## X6          -0.369739   0.113542  -3.256 0.003476 ** 
## X7           1.965609   1.006510   1.953 0.063105 .  
## X4:X3       -0.025190   0.007109  -3.543 0.001735 ** 
## ---
## Signif. codes:  0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1
## 
## Residual standard error: 0.3078 on 23 degrees of freedom
## Multiple R-squared:  0.9695, Adjusted R-squared:  0.9615 
## F-statistic: 121.8 on 6 and 23 DF,  p-value: 2.982e-16
Anova(model_wf_8per9v43_log)
            Sum Sq Df F value   Pr(>F)
x8p9         4.02   1   42.5   1.19e-06
X4           4.33   1   45.7   6.81e-07
X3           3.57   1   37.7   2.9e-06
X6           1      1   10.6   0.00348
X7           0.361  1    3.81  0.0631
X4:X3        1.19   1   12.6   0.00174
Residuals    2.18  23
# Collinearity Diagnostics #
ols_coll_diag(model_wf_8per9v43_log)
## Tolerance and Variance Inflation Factor
## ---------------------------------------
## # A tibble: 6 x 3
##   Variables Tolerance   VIF
##   <chr>         <dbl> <dbl>
## 1 x8p9         0.799   1.25
## 2 X4           0.0237 42.2 
## 3 X3           0.0587 17.0 
## 4 X6           0.790   1.27
## 5 X7           0.133   7.54
## 6 X4:X3        0.0178 56.1 
## 
## 
## Eigenvalue and Condition Index
## ------------------------------
##    Eigenvalue Condition Index    intercept        x8p9           X4           X3           X6           X7        X4:X3
## 1 6.049066882        1.000000 0.0005639673 0.001801295 1.910491e-04 3.048624e-04 0.0028874680 0.0005100875 0.0001502535
## 2 0.471092706        3.583365 0.0039682478 0.030912947 5.900619e-03 2.705445e-06 0.0356653386 0.0002036690 0.0035837574
## 3 0.319285006        4.352662 0.0006948402 0.001297022 2.093687e-03 1.678622e-02 0.0026179833 0.0241742127 0.0025719703
## 4 0.099732633        7.787993 0.0266638329 0.104151970 3.325706e-06 9.240665e-04 0.8680992283 0.0038832498 0.0007316888
## 5 0.046120241       11.452450 0.1334444067 0.855144593 6.711810e-03 3.115010e-02 0.0009313298 0.0163399095 0.0028338370
## 6 0.011734885       22.704125 0.6194113254 0.004562458 8.097080e-02 1.615361e-01 0.0560377240 0.7387387362 0.0019824522
## 7 0.002967648       45.147947 0.2152533798 0.002129714 9.041287e-01 7.892959e-01 0.0337609279 0.2161501352 0.9881460408
#Model Fit Assessment
ols_plot_diagnostics(model_wf_8per9v43_log)

# Residual Normality: Correlation Test
ols_test_correlation(model_wf_8per9v43_log) # Correlation between observed residuals and expected residuals under normality.
## [1] 0.8661026
# Residual Normality Test
ols_test_normality(model_wf_8per9v43_log) # Tests for violation of the normality assumption; a large p-value means no evidence of non-normality #
## -----------------------------------------------
##        Test             Statistic       pvalue  
## -----------------------------------------------
## Shapiro-Wilk              0.7735         0.0000 
## Kolmogorov-Smirnov        0.1678         0.3292 
## Cramer-von Mises          6.3214         0.0000 
## Anderson-Darling          1.3999         0.0010 
## -----------------------------------------------
# Variable Contributions
ols_plot_added_variable(model_wf_8per9v43_log)

# Residual Plus Component Plot
ols_plot_comp_plus_resid(model_wf_8per9v43_log)

# Check PRESS Statistic
ols_press(model_wf_full)
## [1] 15880486
ols_press(model_wf_full_log)
## [1] 8.136733
ols_press(model_wf_437896_log)
## [1] 6.538275
ols_press(model_wf_437_log)
## [1] 10.43262
ols_press(model_wf_137689_log)
## [1] 7.114336
ols_press(model_wf_43rm1_log)
## [1] 4.962623
ols_press(model_wf_8per9v43_log)
## [1] 3.281548
# prediction power
ols_pred_rsq(model_wf_437896_log)
## [1] 0.908418
ols_pred_rsq(model_wf_137689_log)
## [1] 0.900349
ols_pred_rsq(model_wf_43rm1_log)
## [1] 0.9304882
ols_pred_rsq(model_wf_8per9v43_log)
## [1] 0.9540352
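Both statistics above can be reproduced without olsrr, which makes their meaning concrete. PRESS is the sum of squared leave-one-out residuals, and for OLS each of those equals the ordinary residual divided by (1 - leverage); predicted R² is then 1 - PRESS/SST. A self-contained sketch on `mtcars` (stand-in for `table_wf`):

```r
# PRESS and predicted R-squared by hand (what ols_press / ols_pred_rsq compute)
fit <- lm(mpg ~ wt + hp, data = mtcars)

# leave-one-out residual for OLS: e_i / (1 - h_ii), with h_ii the leverage
press <- sum((resid(fit) / (1 - hatvalues(fit)))^2)
sst <- sum((mtcars$mpg - mean(mtcars$mpg))^2)
pred_r2 <- 1 - press / sst    # predicted R-squared
```

PRESS always exceeds SSE, so predicted R² is always below the ordinary R², which is why it is the more honest measure of out-of-sample performance.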

library(texreg)
# Pretty print regression results on screen
lm(mpg ~ wt, data=my_df) %>% screenreg
texreg::screenreg(l=list(model_2_7_8))

# Visualizing
library(GGally)
ggpairs(data=table_b1[c(1,3,8,9)])

# Correlation
cor(table_b1)

# Half correlation matrix:
library(corrr)
mtcars %>% correlate() %>% shave() %>% fashion()
# Visualize correlation matrix:
mtcars %>% correlate() %>% shave() %>% rplot()

# Scatterplot Matrix
mtcars[1:6] %>% plot
# Better looking version
library(ggfortify)
model_2_7_8 %>% autoplot()

# Confidence interval of coefficients
lm(mpg ~ wt + cyl, data=mtcars) %>% confint()

# Hypothesis testing of nested models
lm_mpg_wt <- lm(mpg ~ wt, data=mtcars)
lm_mpg_wt.cyl <- lm(mpg ~ wt + cyl, data=mtcars)
anova(lm_mpg_wt, lm_mpg_wt.cyl)

# convert mpg to kilometers per liter
mtcars %>% mutate(kmpl = mpg * 0.425144) %>% select(mpg, kmpl) %>% filter(mpg > 20) %>% 
  nrow()
mtcars %>% group_by(am) %>% 
  summarize(n=n(),
            mean_mpg=mean(mpg),
            sd_mpg=sd(mpg),
            min_mpg=min(mpg),
            max_mpg=max(mpg))
mtcars %>% arrange(desc(mpg))

# mean & sd
mtcars %>% summarize(am_mean=mean(am), am_sd=sd(am))
# Frequencies by categories
mtcars %>% group_by(am) %>% tally

# Recode a continuous variable into ordered categories with case_when();
## conditions are evaluated in order, and the first match wins
(californiatod <- californiatod %>% 
  mutate(transit_level=case_when(
    transit>0.4~"high",
    transit>0.2~"medium",
    TRUE ~ "low")))


## General linear F test
fit_R <- lm(mpg ~ wt, data=mtcars)
fit_F <- lm(mpg ~ wt + cyl, data=mtcars)
anova(fit_R, fit_F)

SSE_R <- resid(fit_R)^2 %>% sum
SSE_F <- resid(fit_F)^2 %>% sum
df_R <- df.residual(fit_R)
df_F <- df.residual(fit_F)
F_val <- ((SSE_R - SSE_F)/(df_R - df_F))/(SSE_F/df_F)

# Look up the critical F value for alpha=0.05
alpha <- 0.05
qf(alpha, (df_R - df_F), df_F, lower.tail=F)
# Alternatively, find the p-value corresponding to our F_val
pf(F_val, (df_R - df_F), df_F, lower.tail=F)
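As a quick sanity check, the hand-computed F statistic should match the one `anova()` reports for the same nested pair. A self-contained verification:

```r
# Verify the manual general linear F test against anova()
fit_R <- lm(mpg ~ wt, data = mtcars)        # reduced model
fit_F <- lm(mpg ~ wt + cyl, data = mtcars)  # full model
SSE_R <- sum(resid(fit_R)^2)
SSE_F <- sum(resid(fit_F)^2)
df_R <- df.residual(fit_R)
df_F <- df.residual(fit_F)
F_manual <- ((SSE_R - SSE_F) / (df_R - df_F)) / (SSE_F / df_F)
F_anova <- anova(fit_R, fit_F)$F[2]         # second row carries the test
```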
n <- nrow(mtcars)          # number of observations
k <- length(coef(fit_R))   # number of coefficients
## Calculate R2 and adjusted R2 manually
TSS <- sd(mtcars$mpg)^2 * (n - 1)
# OR
TSS <- var(mtcars$mpg) * (n - 1)
(R2_R <- 1 - SSE_R/TSS)
(R2_R_adj <- 1 - (SSE_R/(n - k))/(TSS/(n - 1)))
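The manual R² formulas can likewise be checked against what `summary()` reports, confirming that the adjusted R² formula above is the standard one:

```r
# Manual R2 and adjusted R2, checked against summary()
fit_R <- lm(mpg ~ wt, data = mtcars)
n <- nrow(mtcars)
k <- length(coef(fit_R))     # number of coefficients, intercept included
SSE_R <- sum(resid(fit_R)^2)
TSS <- var(mtcars$mpg) * (n - 1)
R2_manual <- 1 - SSE_R / TSS
R2_adj_manual <- 1 - (SSE_R / (n - k)) / (TSS / (n - 1))
```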

# Interaction Terms
huxreg(
  lm(houseval ~ transit, data=californiatod),
  lm(houseval ~ transit * railtype, data=californiatod),
  lm(houseval ~ transit * region, data=californiatod),
  lm(houseval ~ transit * CA, data=californiatod))

# redefine the region variables with a new reference category (4 for SD)
catod2 <- californiatod %>% mutate(region = relevel(as.factor(region), ref = 4))
lm(houseval ~ region, data=catod2)  %>%  summary

# Partial F test:
catod3 <- californiatod %>% mutate(region = ifelse(region =="LA" | region == "SD", "LA_SD", region))
lm(houseval ~ region, data=catod3)  %>% summary
anova(lm(houseval ~ region, data=catod3), lm(houseval ~ region, data=californiatod))

# Hypothesis testing of linear combination of coefficients
car::lht(model_2_7_8, "x2 = x7")
# Partial F test: H0: β2 + β3 = 0
car::lht(lm(hours ~ married*women, data=chores), "women + married:women = 0")

# Linear combination of coefficients
# The point estimate is beta2_hat + beta3_hat. In this case the linear combination
# involves the sum rather than the difference of two coefficients, and the standard
# error of the sum of two coefficient estimates is:
# $\sqrt{\hat{\sigma}^2_{\hat{\beta}_2} + \hat{\sigma}^2_{\hat{\beta}_3} + 2\hat{cov}_{\hat{\beta}_2,\hat{\beta}_3}}$
fit1 <- lm(hours ~ married*women, data=chores)
beta2 <- coef(fit1)["women"]
beta3 <- coef(fit1)["married:women"]
betas_vcov <- vcov(fit1)
se <- sqrt(betas_vcov["women", "women"] + betas_vcov["married:women", "married:women"] + 2 * betas_vcov["women", "married:women"])
(t_stat <- (beta2 + beta3)/se)

## Degrees of Freedom
dof <- fit1$df.residual

## compare t_stat to critical t-value
(t_crit <- qt(0.025, df=dof, lower.tail = F))
## OR find the corresponding p-value
(p_val <- 2 * pt(abs(t_stat), lower.tail = F, df=dof)) # abs() keeps the two-sided p-value valid when t_stat is negative
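The vcov-based standard error can be cross-checked by reparameterizing the model, sketched here on `mtcars` (a stand-in for the `chores` data): in y = b0 + b1·x1 + b2·x2, substituting x1 = (x1 - x2) + x2 gives y = b0 + b1·(x1 - x2) + (b1 + b2)·x2, so the coefficient on x2 in the reparameterized fit is exactly b1 + b2 with the standard error we want.

```r
# SE of a sum of coefficients, two ways (mtcars as a stand-in dataset)
fit <- lm(mpg ~ wt + hp, data = mtcars)
V <- vcov(fit)
se_sum <- sqrt(V["wt", "wt"] + V["hp", "hp"] + 2 * V["wt", "hp"])

# Reparameterize: coefficient on hp is now b_wt + b_hp, with its SE
fit2 <- lm(mpg ~ I(wt - hp) + hp, data = mtcars)
se_sum2 <- summary(fit2)$coefficients["hp", "Std. Error"]
```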

# Partial F test on the nonlinear term
anova(lm(houseval ~ density, data=californiatod),lm(houseval ~ density + I(density^2), data=californiatod))
# To be on the safe side, enclose your transformation in an I() function. This is not necessary for log transformations.

library(olsrr)
# leverage (hat)
leverage <- ols_leverage(lm_sfr)
ols_rsdlev_plot(lm_sfr)
# Cook's distance
ols_cooksd_chart(lm_sfr)
# DFFITS
ols_dffits_plot(lm_sfr)
# DFBETAS
ols_dfbetas_panel(lm_sfr)
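The same influence measures exist in base R, which is useful when olsrr is unavailable; a self-contained sketch on `mtcars` (`lm_sfr` is not defined in this excerpt):

```r
# Base-R equivalents of the olsrr influence diagnostics
fit <- lm(mpg ~ wt + hp, data = mtcars)
h  <- hatvalues(fit)        # leverage (hat values); they sum to p
cd <- cooks.distance(fit)   # Cook's distance
df_fits  <- dffits(fit)     # DFFITS
df_betas <- dfbetas(fit)    # DFBETAS, one column per coefficient
# a common leverage cutoff is 2p/n, with p = number of coefficients
cutoff <- 2 * length(coef(fit)) / nrow(mtcars)
```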
# Heteroskedasticity
ols_rvsp_plot(lm_sfr)
ols_rsd_qqplot(lm_sfr)
# hypothesis test of normality of residuals
ols_norm_test(lm_sfr)
# Test of Heteroskedasticity with Breusch-Pagan Test
ols_bp_test(lm_sfr)
#Heteroskedasticity-Consistent Standard Errors
# standard variance-covariance matrix
vcov0 <- vcov(lm_sfr)
vcov(model_2_7_8)
# convert to a correlation matrix
cov2cor(vcov0)
# Heteroskedasticity-Consistent variance covariance matrix
require(car)
vcov_hc3 <- hccm(lm_sfr, type="hc3")
# In the presence of heteroskedasticity, vcov_hc3 is typically larger than vcov0;
# redo the hypothesis tests with the heteroskedasticity-consistent matrix:
if (!require(lmtest)) { install.packages("lmtest"); library(lmtest) }
coeftest(lm_sfr, vcov_hc3)
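What `hccm(type="hc3")` computes can be written out directly, which demystifies the sandwich form HC3 = (X'X)⁻¹ X' diag(eᵢ²/(1-hᵢ)²) X (X'X)⁻¹. A base-R sketch on `mtcars`:

```r
# HC3 heteroskedasticity-consistent covariance matrix by hand
fit <- lm(mpg ~ wt + hp, data = mtcars)
X <- model.matrix(fit)
e <- resid(fit)
h <- hatvalues(fit)
meat  <- crossprod(X, X * (e^2 / (1 - h)^2))  # X' diag(w) X, row-wise weights
bread <- solve(crossprod(X))                  # (X'X)^-1
vcov_hc3_manual <- bread %*% meat %*% bread
```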
# All possible subset
sfrmodel <- lm(TOTALVAL ~ BLDGSQFT + YEARBUILT + GIS_ACRES + dpioneer + dfwy + dpark + dmax + dbikehq, data = taxlot_sfr)
(sfrmodel_all_subset <- ols_all_subset(sfrmodel))
# Best Subset Regression
ols_best_subset(model_2_7_8)
# Multicollinearity with VIF
ols_vif_tol(lm_sfr)
## Stepwise Forward Regression
# based on p-value
(sfrmodel_stepfwd.p <- ols_step_forward(sfrmodel))
# based on AIC
(sfrmodel_stepfwd.aic <- ols_stepaic_forward(sfrmodel))
## Stepwise Backward Regression
# based on p-value
(sfrmodel_stepbwd.p <- ols_step_backward(sfrmodel))
# based on AIC
(sfrmodel_stepbwd.aic <- ols_stepaic_backward(sfrmodel))
## Step AIC regression
# Build a regression model from a set of candidate predictors by entering and
# removing predictors based on the Akaike Information Criterion, stepwise, until
# no variable is left to enter or remove. The starting model should include all
# the candidate predictor variables.
(sfrmodel_stepboth.aic <- ols_stepaic_both(sfrmodel))
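Base R's `step()` performs the same AIC-based stepwise search without olsrr; a self-contained example on `mtcars`:

```r
# AIC-based stepwise selection with base R's step()
fit_full <- lm(mpg ~ wt + hp + disp + qsec + drat, data = mtcars)
fit_step <- step(fit_full, direction = "both", trace = 0)  # trace=0 silences the log
```

The returned object is an ordinary `lm` fit, so `summary()`, `confint()`, etc. apply as usual.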

# Cross Validation: CV assesses how the results of a model will generalize to an
# independent data set. It is mainly used where the goal is prediction and one
# wants to estimate how accurately a predictive model will perform in practice.
library(modelr)
library(purrr)
(taxlot_sfr_kcv <- taxlot_sfr %>% 
  modelr::crossv_kfold() %>% 
  mutate(model=map(train, ~lm(TOTALVAL~BLDGSQFT+YEARBUILT+GIS_ACRES+dpioneer+dfwy, data=.x)),
         rmse=map2_dbl(model, test, modelr::rmse),
         rsquare=map2_dbl(model, test, modelr::rsquare)))
taxlot_sfr_kcv %>% 
  summarise_at(c("rmse", "rsquare"), mean)  # funs() is deprecated
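The same k-fold idea can be written in base R without modelr/purrr, which makes the mechanics explicit (sketched on `mtcars`; fold assignment is random, so a seed keeps it reproducible):

```r
# k-fold cross-validation in base R
set.seed(1)
k <- 5
folds <- sample(rep(1:k, length.out = nrow(mtcars)))  # random fold labels
rmse_k <- sapply(1:k, function(i) {
  train <- mtcars[folds != i, ]
  test  <- mtcars[folds == i, ]
  fit <- lm(mpg ~ wt + hp, data = train)
  sqrt(mean((test$mpg - predict(fit, newdata = test))^2))
})
mean(rmse_k)   # cross-validated RMSE
```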

## DID omitted
## Discrete Outcome: Count/Poisson Regression
require(MASS)
require(huxtable)
fit_lm <- lm(carb ~ mpg + qsec, data=mtcars)
fit_glm <- glm(carb ~ mpg + qsec, data=mtcars, family="poisson")
huxreg(OLS=fit_lm, Poisson=fit_glm)

fit_lm <- lm(am ~ qsec + hp, data=mtcars)
fit_glm <- glm(am ~ qsec + hp, data=mtcars, family=binomial("logit"))
huxreg(OLS=fit_lm, logit=fit_glm)

# log Likelihood
logLik(fit_glm)
fit_glm0 <- update(fit_glm, .~1)
logLik(fit_glm0)
## 'log Lik.' -21.61487 (df=1)
# pseudo R2
1 - logLik(fit_glm)/logLik(fit_glm0)
## 'log Lik.' 0.381052 (df=3)
# Interpretation of coefficients
# odds ratios: exp(coef) for the slopes; exp(intercept) is the baseline odds
(odds <- exp(coef(fit_glm)))
# odds/(1 + odds) converts an odds value to a probability
# (meaningful for the baseline odds, not for the odds ratios)
odds/(1 + odds)
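For predicted probabilities at the actual covariate values, the cleaner route is `predict()` with `type = "response"`, which applies the inverse logit to the linear predictor; a self-contained refit of the logit above:

```r
# Predicted probabilities from a logit model
fit_glm <- glm(am ~ qsec + hp, data = mtcars, family = binomial("logit"))
p_hat <- predict(fit_glm, type = "response")  # probabilities in (0, 1)
lp <- predict(fit_glm, type = "link")         # linear predictor (log-odds)
# p_hat equals plogis(lp), the inverse-logit transform
```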

huxtable::huxreg(model_2_7_8, statistics = NULL)

library(leaps) # Load the package #
model_wf_subset <- regsubsets(log(y) ~ X2 + X3 + X4 + X5 + X6 + X7 + X8 + X9, data = table_wf, nbest = 10) # nbest is the number of models reported for each model size #
summary(model_wf_subset) # Hard to read output from this #

## plot adjusted R square for each model ##
plot(model_wf_subset, scale='adjr2')
## can use Cp, r2 or bic for scale ##
plot(model_wf_subset, scale='bic')
plot(model_wf_subset, scale='Cp')

shapiro.test(rstudent(model_wf_reduce_log)) # A large p-value means no evidence of non-normality #

table_wf_resi <- table_wf %>% mutate(student_residual=rstudent(model_wf_reduce_log))
ggpairs(data=table_wf_resi[c(10,3,4,6,7,8,9,11)])

table_wf_resi <- table_wf %>% mutate(student_residual=rstudent(model_wf_reduce_log))
ggpairs(data=table_wf_resi[c(10,3,4,7,11)])


Anova(model_wf_final)
vif(model_wf_final)

confint(model_wf_final, level = 1 - 0.05/length(coef(model_wf_final))) # Bonferroni joint confidence intervals: level = 1 - alpha/g for g intervals #
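The Bonferroni adjustment widens each interval so that g intervals hold jointly at the stated confidence: each is computed at level 1 - α/g. A self-contained sketch on `mtcars`:

```r
# Bonferroni joint confidence intervals for all coefficients
fit <- lm(mpg ~ wt + hp, data = mtcars)
g <- length(coef(fit))                         # number of intervals held jointly
ci_single <- confint(fit)                      # individual 95% intervals
ci_bonf <- confint(fit, level = 1 - 0.05 / g)  # joint (Bonferroni) intervals
```

Each Bonferroni interval contains the corresponding individual interval, the price paid for joint coverage.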

plot(model_wf_final, pch=16, col="blue")
#Create Partial Regression plots #
avPlots(model_wf_final)

confint(model_wf_437, level = 1 - 0.05/length(coef(model_wf_437))) # Bonferroni joint confidence intervals: level = 1 - alpha/g for g intervals #

plot(model_wf_437, pch=16, col="blue")
#Create Partial Regression plots #
avPlots(model_wf_437)


deviation <- table_wf$y-mean(table_wf$y)

# Predicted_R2 = 1 - (PRESS/SST); PRESS and SST must be on the same response scale
1-((MPV::PRESS(model_wf_final))/(deviation%*%deviation)) # Compute SST by multiplying two vectors #
# prediction power of full (a log(y) model, so SST is computed on log(y))
1-((MPV::PRESS(model_wf_reduce_log))/(var(log(table_wf$y))*(nrow(table_wf)-1)))
# prediction power of 437
1-((MPV::PRESS(model_wf_437))/(var(table_wf$y)*(nrow(table_wf)-1)))
# prediction power of backward
1-((MPV::PRESS(model_wf_final))/(var(table_wf$y)*(nrow(table_wf)-1)))
1-((ols_press(model_wf_437896_log))/(var(log(table_wf$y))*(nrow(table_wf)-1)))